Need we say more?
We’re again thrilled to be sponsoring the Tableau Customer Conference. Next week, September 8-11, TCC13 will be in full stride at the Gaylord National Resort and Convention Center in Washington, DC. We’ll be running a session, hosting a geo drink-up, exhibiting and giving things away, and we hope you can make it! Details below, and please be sure to follow us on Twitter for the latest.
Geographic Analysis in Tableau: from Business Requirements to User Needs
Although Tableau natively leverages mapping capabilities from Urban Mapping, there is often a semantic and technical gap when it comes to addressing GIS needs. In this interactive session, Data Strategy & Visualization Consultant Leigh Fonseca and Urban Mapping President Ian White will dissect real-world case studies and translate business needs into actionable Tableau-driven results. During this hands-on session, you will learn how to:
- dynamically map business measures
- leverage the goodness of dual axis maps
- use maps to evaluate sales by custom regions
- report on marketing efficacy by industry recognized regions
- report on election results by census tract
- view satellite imagery with your detailed city level data
- customize your map views, boundaries and colors
If you’ve ever wanted to know more about geocoding, postal codes, great arcs, choropleth maps, heat maps and more, this session is for you.
We’re looking forward to seeing old friends, making new ones and having a great time – see you there!
We’re thrilled to be making a repeat appearance at the annual Wolfram Alpha Summit in Washington, DC. This year Ian’s talk will focus on the all-important (but oft-neglected) subject of metadata. Please join us September 5-6 at the Park Hyatt!
We’re at the Tableau Customer Conference in London and just announced our CDN. This is exciting news if you are located, um, anywhere in the world. Depending on your location, Tableau map performance will be 2 to 10 times faster. We can’t move the internet, but we can move our content closer to customers.
As Tableau’s official mapping partner, Urban Mapping lets Tableau customers access base maps, create custom geographic overlays and other visualizations (including demographics and high-resolution imagery), and tap additional data from the Mapfluence on-demand data catalog, all within Tableau products. Some choice words from Tableau:
“As a global software company committed to pushing the envelope with visualization technologies, mapping is a must have,” commented James Eiloart, Vice President of Europe, Middle East and Asia at Tableau. “With the CDN, Urban Mapping is not only helping deliver value-added capabilities to our customers, they have also taken the extra step to develop an innovative delivery platform to provide those capabilities as fast as possible.”
Because Mapfluence has native access to Tableau, it delivers higher performance, reliability and service levels than legacy mapping technologies. Interested customers can access a free evaluation of enhanced mapping capabilities in Tableau. These capabilities include:
When The Guardian broke the story about the NSA demanding ‘telephony metadata’ from Verizon, a new word entered the public lexicon. In the world of maps and geographic data, metadata gets a bad rap. It’s generally perceived as a pain in the ass – something that must be tended to, like a perpetually leaky bike tire or a messy room. At Urban Mapping, we’ve always viewed this differently: poor documentation, or an inability to readily know things about data, especially geographic information, is costly.
Library science dawned at the Library of Alexandria in Egypt, where Callimachus conceived the first bibliographic system (the Pinakes) in the third century BC. Metadata developed as a way to catalog key elements of written works and make them easily searchable. Unfortunately, this also had an unintended consequence: it divorced metadata from data.
The next significant development in metadata came nearly two millennia later. In 1595, Johan van der Does of Leiden University published the Nomenclator, the first definitive catalog of a library’s holdings. The index was fairly crude, but it had taken almost two thousand years to arrive. Next up was Melvil Dewey, whose Dewey Decimal System organized all knowledge into ten main classes, each subdivided into ten divisions and each division into ten sections. The decimal approach allowed for effectively infinite hierarchy. Other systems followed, such as the Universal Decimal Classification and the Library of Congress Classification.
Fast-forward a few decades. Libraries, archives, bookstores and other repositories of knowledge were filled to the brim with card catalogs and the like, and tremendous human effort went into manually drafting, distributing and maintaining indices of their collections. What to do with these massive storehouses of index cards? Thankfully the information age helped solve the real estate problem by creating searchable digital archives of metadata. Beginning in the late 1960s, the OCLC took the lead in centralizing the digitization and storage of metadata for all types of content. Metadata had become completely divorced from that which it describes – a bifurcation that simultaneously advanced society and held it back. If I were interested in, say, a book about musical scores and wanted to know more about New York, New York, I might be SOL. Searching for “Sinatra” might get me what I want, but searching metadata remained limited by what had been indexed.
Then the Internet happened. Moore’s Law took root, processing power went exponential and storage costs dropped like rocks. The cost of purging data became greater than the cost of keeping it. Bookstores, newspapers, libraries and anything dead-tree-oriented faced irrelevance. After a long separation, metadata was reunited with data – though not by the world of library science. Google Books and Amazon’s Search Inside are great examples of bringing content together with metadata, allowing users to simultaneously perform full-text searches and query metadata.
Contrast this with the world of geospatial data, where metadata remains off to the side, completely divorced from content – effectively not much more advanced or useful than the days of card catalogs. This is why we take it so seriously at Urban Mapping. In the next few months we’ll be unveiling a significant improvement over current geospatial metadata, but more on that later…
Ok, you’re thinking: what does this have to do with domestic surveillance? With the NSA demanding telephony metadata from Verizon and President Obama assuring Americans that nobody is listening to their phone calls, what exactly is the big deal? Below is a list of what could be provided. The left column is the demand, the center column lists derivative information (meta-metadata?) that could be compiled from Verizon’s logs, and the right column indicates what that could mean.
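To make the “derivative information” point concrete, here is a toy sketch: even without hearing a single call, a log of (timestamp, cell tower) pairs lets an analyst infer where a subscriber likely lives, simply by asking which tower their phone sits on overnight. The records, tower names and the `likely_home_tower` helper below are all hypothetical, invented for illustration only.

```python
# Toy illustration: inferring a likely home area from call metadata alone.
# All records and tower identifiers are fabricated.
from collections import Counter
from datetime import datetime

call_log = [  # (ISO timestamp, tower ID) -- no call content anywhere
    ("2013-06-01T08:15:00", "tower_downtown"),
    ("2013-06-01T13:40:00", "tower_midtown"),
    ("2013-06-01T23:05:00", "tower_suburb_7"),
    ("2013-06-02T01:30:00", "tower_suburb_7"),
    ("2013-06-02T22:50:00", "tower_suburb_7"),
]

def likely_home_tower(log, night_start=22, night_end=6):
    """Return the tower seen most often during nighttime hours."""
    overnight = Counter(
        tower
        for ts, tower in log
        if (h := datetime.fromisoformat(ts).hour) >= night_start or h < night_end
    )
    return overnight.most_common(1)[0][0] if overnight else None

print(likely_home_tower(call_log))  # prints "tower_suburb_7"
```

Five rows of “mere” metadata are enough to surface a residence pattern; scale that to months of records and every subscriber, and the metadata-versus-data distinction collapses.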
The bottom line in Mapland is a mantra we’ve had in place for years: one person’s metadata is another person’s data. The Verizon-NSA issue is case study #1 in how metadata plays a critical role in surfacing actionable insights. In this case political and security considerations are paramount, but it very clearly illustrates the concept. Metadata has gone from wonky to pedestrian in a matter of days, and hopefully future uses of the term will no longer need quotes.