Navigating unfamiliar environments is challenging for most people, especially for individuals with visual impairments. While many personal navigation tools have been proposed to enable independent indoor navigation, they suffer from insufficient accuracy (e.g., 5-10 m), do not provide semantic information about the surroundings (e.g., doorways, shops, etc.), and may require specialized devices to function. Moreover, many systems are evaluated only in constrained scenarios, which may not accurately reflect their performance in the real world. Therefore, we have designed and implemented NavCog3, a smartphone-based indoor navigation assistant that has been evaluated in a 21,000 m² shopping mall. In addition to turn-by-turn instructions, it provides information on landmarks (e.g., tactile paving) and nearby points of interest. We first conducted a controlled study with 10 visually impaired users to assess localization accuracy and the perceived usefulness of semantic features. To understand the usability of the app in a real-world setting, we then conducted a second study with 43 participants with visual impairments, who could freely navigate the shopping mall using NavCog3. Our findings suggest that NavCog3 can open a new opportunity for users with visual impairments to independently find and visit large and complex places with confidence.

Copyright is held by the owner/author(s).