At last week’s press conference my colleague Hal Reid asked if the newly acquired Vexcel team would be working with the SQL Server team. The response was a less-than-stellar “oh, we can have you speak to that team.” I suspect others on the call thought it an odd question. I didn’t.
I’m not sure if Reid was thinking of storage or image recognition issues, but he was 100% on target, as Barbara Darrow writes at CRN in her coverage of the Vexcel acquisition.
Last month, Paul Flessner, Microsoft’s senior vice president of data and storage platforms, said one priority is expanding the current data store and database functions to handle not just text and tables, but images and sounds.
One stumbling block thus far has been a weakness in pattern- and image-recognition algorithms. That is an area to be addressed by third parties and/or by Microsoft itself, he noted.
Data stores and databases will have to be redesigned or retrofitted to handle these content-rich data types and make them searchable on more than text tags, observers said. In Microsoft’s case, the ability to handle that data and search it will come in the “Katmai” timeframe. Katmai is the next-gen SQL Server, expected by sources to debut in 2008.
Also note that just last week Overwatch acquired pattern recognition experts Visual Learning Systems. That’s the next frontier for folks like Microsoft to pursue. Look for more interest in companies with that sort of expertise.
by Adena Schutzberg on 05/08 at 07:24 AM |
The newest book from ESRI Press is titled “Think Globally, Act Regionally: GIS and Data Visualization for Social Science and Public Policy Research.” An article at SF (San Francisco) State News explains how it got written.
The textbook is part of a project to introduce GIS—which uses computer-generated maps to make comparisons and projections—into urban studies and urban planning programs nationwide. The Space, Culture, and Urban Policy Project is funded by a three-year, $432,000 grant from the National Science Foundation.
The school’s Professor of Urban Studies Richard LeGates wrote the book, apparently under the grant, then had ESRI publish it.
I, for one, had no idea how all that worked. But, apparently, that’s one way.
by Adena Schutzberg on 05/08 at 07:11 AM |
GIS is really an “everyday” sort of thing. Consider this small article in the Commercial Appeal (Memphis, TN). It turns out that if you live in the “Zip Codes [sic] that are split between two municipalities or between a city and an unincorporated area,” you are more likely not to have paid to register your car in the city. Finding people in that situation will net the city some $54,000 in the coming year. The GIS helps, per the final line of the article:
Henning [deputy administrator in the county clerk’s office] said analysts in her office started reviewing the Memphis area last fall, when city officials called in March alerting them that they could do the work faster using their geographic information system, or GIS.
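The spatial work behind this is simple in principle: overlay address points on the city boundary and flag which ones fall inside, rather than trusting the ZIP code. A minimal sketch (all names and coordinates hypothetical; real GIS software would use actual parcel and boundary data) using a ray-casting point-in-polygon test:

```python
# Hypothetical sketch: a ZIP code straddles a city boundary, so the ZIP
# alone can't tell you who owes the city registration fee. A point-in-
# polygon test against the city limits can. Ray casting stands in here
# for the spatial query a real GIS would run.

def point_in_polygon(x, y, polygon):
    """Return True if point (x, y) falls inside polygon (list of vertices)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast rightward from (x, y);
        # an odd number of crossings means the point is inside.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Toy city boundary: a unit square. Both addresses share the same ZIP,
# but only one lies within the city limits.
city_limits = [(0, 0), (1, 0), (1, 1), (0, 1)]
addresses = {"123 Main St": (0.5, 0.5), "456 County Rd": (1.5, 0.5)}

for name, (x, y) in addresses.items():
    status = "inside city" if point_in_polygon(x, y, city_limits) else "outside city"
    print(f"{name}: {status}")
```

In practice this overlay would run against geocoded addresses and official municipal boundary layers, which is presumably why the GIS made the clerk's office review so much faster.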
As an aside, if you do live in Memphis you pay: “a $30 city fee annually to register their vehicles, in addition to the $50 wheel tax and the $24 state registration fee.” Wheel tax? How quaint! When I lived in Pennsylvania a few years ago, I believe it was my township that had a “head tax,” which, as I explained to friends, “you had to pay if you had a head.”
by Adena Schutzberg on 05/08 at 07:01 AM |
At MapInfo’s MapWorld last week, Tom Villani, vice president of global alliances at Microstrategy, warned that you had better start understanding and investing in good BI/LI (business intelligence/location intelligence) solutions lest you drown in a data glut. "Data storage is too cheap…no one is throwing out any old data," he said. Villani also said that his company just purchased 4 terabytes of data storage for only $25,000. The consequences? Not only will better search tools be needed, but a way to filter, refine, and report information to decision makers will be absolutely essential if data warehouse archives remain online indefinitely. Geospatial data will be no exception, and mining those data will require better tools. Companies like Teradata seem well positioned in the data warehousing market, and BI vendors are collaborating with them on ways to access huge volumes of transactions.
by Joe Francica on 05/08 at 01:58 AM |
Speaking at MapInfo’s MapWorld Conference last week, Gail McGiffin, managing partner of the underwriting practice at Accenture and formerly with Chubb, said that “In property and casualty space we are data rich and information poor. Information sits in ancient data warehouses…we must get better data into the front office.” She also noted that reliance on legacy systems is not a small problem. “It’s not easy to get to a legacy platform,” she said. The insurance companies seem to have a data quality and consistency problem, starting with just finding the right data at the time it is needed. The industry confronted a host of challenges in measuring its exposure after major natural disasters such as Hurricane Andrew in 1992 and the World Trade Center attacks of 9/11. However, McGiffin said that it is often the smaller risks that pop out of a location intelligence analysis that may expose more risk than major catastrophic events like a Hurricane Katrina. She also stressed that reusability of data across the enterprise is essential and that bringing data into a centralized data store would be extremely beneficial. She foresees the use of more granular data to gain insight into customer behavior and buying patterns. There will come a time when how hard you hit the brakes or how fast you drive will be monitored by insurance companies in order to do a more thorough job of assessing how to underwrite your specific auto insurance policy. Scary!
by Joe Francica on 05/08 at 01:08 AM |