The 2007 Amgen Tour of California started this weekend. It looks like some of my former colleagues at Quokka who are now at Adobe (formerly Macromedia) have been working on the realtime race tracker.
Live Race Coverage
Francis Potter from Adobe sent over the announcement from Yottapixel:
An engineering team at Adobe has been working for a few months on the Tracker. Many of the team members worked in 1998-2001 at Quokka Sports, a dot-com-era startup which pushed the envelope at the time with live sports coverage. In many ways, the Amgen Tour Tracker represents the culmination of what that team was trying to accomplish years ago.
Some of the stylistic elements in the Tour Tracker pick up where Quokka left off. The live video feed resizes to fit the user’s browser — even if that means the video is pixelated. Data, commentary, and image thumbnails are overlaid on the edges of the video, sometimes obscuring the action. There are lots of shades of grey in the interface. The end result delivers a rough, cluttered look which emphasizes the liveness, urgency, and experimental nature of the medium.
The world's moving onto the web, all right:
Ottawa: Scientists will soon start attaching microchips to fish and other marine animals to track their movements around the world's oceans and learn how they are being affected by phenomena such as climate change and overfishing, experts said on Monday.
Canada's Ocean Tracking Network has received a hefty grant to build an ocean-based network of life-based mobile data gatherers. It's an ambitious project: tag a whole slew of marine beasties, drop sensors in the ocean, see where they go, watch this change over time, and use this data to make arguments for policy and as a basis for further scientific research.
The Tagging of Pacific Pelagics project has had something like this going for a while now; their Near Real-Time Animal Tracks covers several animal species over a wide range of Pacific Ocean locations. What seems noteworthy about the OTN is their desire (and now mandate) to extend this technology worldwide, with an expressly political aim. Exciting.
When the Exploratorium and Scott Snibbe approached us in 2005 to visualize realtime GPS positions of San Francisco taxi cabs in the Cabspotting project, we leapt at the chance. Taxi data is highly dynamic, mappings of it change noticeably from viewing to viewing, it has an easy reference to the real world, and it lends itself to visually inspiring and meaningful work with the lightest of touches from us; all things we like.
It also suggests a wide range of potential new questions to answer and new things to make. Adam Millard-Ball suggested applications for just about everyone: taxi drivers (how long is the average wait for a pickup in this neighborhood?), taxi passengers (obviously, where's the nearest taxi, but also do they tend to be here at this time?), TLC planners (do we really need more taxis in North Beach? the data would tell you), and city planners (it looks like a lot of people are taking cabs at 8:50am from point A to point B; maybe we should have a taxi stand or ride share there?).
Updates once a minute or so
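A question like the average wait between fares falls out of the data almost directly. Here is a minimal sketch; the event format (per-cab, per-neighborhood timestamped pickup/dropoff events) is an assumption for illustration, not the actual Cabspotting feed:

```python
def average_wait(events):
    """Average minutes a cab sits between a dropoff and its next pickup.

    events: hypothetical list of (timestamp_minutes, kind) tuples for one
    cab in one neighborhood, where kind is "dropoff" or "pickup".
    """
    waits, last_drop = [], None
    for t, kind in sorted(events):
        if kind == "dropoff":
            last_drop = t
        elif kind == "pickup" and last_drop is not None:
            waits.append(t - last_drop)
            last_drop = None
    return sum(waits) / len(waits) if waits else None

# A cab waits 10 minutes, then 6 minutes, between fares:
avg = average_wait([(0, "dropoff"), (10, "pickup"),
                    (20, "dropoff"), (26, "pickup")])
```

Aggregate that per neighborhood and you have the taxi drivers' answer; flip the event kinds and you have the passengers'.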
From the beginning, we made a deliberate decision to map only the data — to let the material we were getting from Yellow Cab tell the story, and not to rely on any kind of overlay or specific relation to an underlying, pre-generated map. We wanted to see what story the data itself would tell. It resulted in some pretty interesting artifacts; in particular the activity around the Bay Bridge between San Francisco and Oakland.
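The "map only the data" idea can be sketched as nothing more than binning raw GPS fixes into a density grid: with enough fixes, the dense cells trace out the street grid (and the Bay Bridge) with no base map underneath at all. The coordinates below are hypothetical fixes, not real Cabspotting data:

```python
from collections import Counter

def density_grid(fixes, cell=0.001):
    """Bin raw (lat, lon) GPS fixes into roughly 100m cells.

    Rendering each cell's count as brightness draws the city's streets
    from the taxi traces alone, with no underlying base map.
    """
    grid = Counter()
    for lat, lon in fixes:
        grid[(round(lat / cell), round(lon / cell))] += 1
    return grid

# Hypothetical fixes near the Bay Bridge approach:
fixes = [(37.7983, -122.3778), (37.7983, -122.3779), (37.7990, -122.3900)]
grid = density_grid(fixes)
```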
Since 2001, Stamen has developed a reputation for beautiful and technologically sophisticated projects in a diverse range of commercial and cultural settings. We work and play with a surprising and growing range of collaborators: news media, financial institutions, artists and architects, car manufacturers, design agencies, museums, technology firms, political action committees, and universities.
Find out more about who Stamen are, or view our client work and research projects.
Last fall, the San Francisco Museum of Modern Art (SFMOMA) reached out to Stamen to help with an exciting project: imagining the future of their museum collection API, which was in its very early stages. The ask was twofold: to do some initial experiments with the API to see what kinds of walls we hit, and to host an event bringing together artists, designers, and museum staff to do some imagining of their own.
Our initial explorations of the API resulted in works that were practical as well as experimental data art, like this representation of the lifespans of artists from the collection:
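A lifespan view like this needs little more than birth and death years per artist. A minimal sketch, assuming hypothetical record fields ("name", "born", optional "died") rather than the actual SFMOMA API schema:

```python
def artist_lifespans(records, current_year=2014):
    """Turn collection records into (name, birth_year, span_years) tuples.

    records: hypothetical dicts with "name", "born", and an optional
    "died" year; living artists get an open-ended span up to current_year.
    """
    spans = []
    for r in records:
        end = r.get("died") or current_year
        spans.append((r["name"], r["born"], end - r["born"]))
    return spans

spans = artist_lifespans([
    {"name": "A. Painter", "born": 1900, "died": 1980},
    {"name": "B. Sculptor", "born": 1970},  # still living
])
```

Sorted by birth year and drawn as horizontal bars, these tuples become the timeline shown above.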
The event, which we called Art + Data Day, took place at the new Gray Area Art and Technology Theater in San Francisco on Oct. 30, 2014. Facilitated by Chacha Sikes (above), the event was formatted as a participatory design workshop, focusing on collaboration and problem solving (rather than competition and speed) as a way of testing the alpha API. The event brought together museum administrators and staff – like their Head of Digital Keir Winesmith – as well as artists, technologists, and designers – like Scott Murray and Micah Scott (shown below) – to explore what they could do with that iteration of the data.
The group broke into four teams, each with a different focus based on skills and interest:
- Team #Selfie explored how images taken by museum visitors could provide new information about the relationships people have to an artwork and how it’s perceived.
- Team Context investigated how the SFMOMA API might be paired with external data sets to deliver related content that would provide greater context about artworks and artists.
- Team PixelMasher thought about what could be learned from the visual context and pixels within the images represented in the collection.
- Remotely, John Higgins, SFMOMA's information architect, submitted a project that inquired whether artwork titles convey a specific emotion, revealing the sentiment of an artist across all of his or her works contained in SFMOMA's collection over time.
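The title-sentiment idea can be sketched with a simple lexicon score; the word lists below are a toy example, and a real version would use a proper sentiment library rather than this hand-rolled lexicon:

```python
# Toy lexicons for illustration only.
POSITIVE = {"joy", "love", "light", "spring"}
NEGATIVE = {"dark", "death", "war", "grief"}

def title_sentiment(title):
    """Score an artwork title: positive word count minus negative word count."""
    words = title.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
```

Scoring every title by an artist and plotting the scores against the works' dates gives the over-time sentiment view the project proposed.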
At the end of the day, everyone walked away feeling like they had contributed to something exciting, which they had. SFMOMA also had some clear takeaways around both the abilities and limitations of the API.
To get all the juicy details about Art + Data Day, the teams, and what they created, take a look at the full report on the SFMOMA Lab’s blog and be sure to watch this Art + Data Day video.
This week we will be presenting this work at Museums and the Web 2015 in Chicago, and we couldn’t be more excited to meet others working at this intersection of art, design, and data.
Do you want Stamen to help with a participatory design workshop? Get in touch!
Today we launched CaliParks.org for the state’s Parks Forward Commission. CaliParks.org is the first statewide parks search engine that brings together expert-level park boundary and management data with social media content from Instagram, Flickr, Twitter, and Foursquare.
Our mission was to create a statewide search engine for parks that would show you information about parks regardless of agency, so you don’t have to know (or care) who runs the park you want to visit. You just want to know where it is, what you can do there, and how to get there.
With the definitive California Protected Areas Database (calands.org), CaliParks.org can tell you exactly which of the state’s 11,826 public parks are near you.
With activities data from Hipcamp.com and GreenInfo.org, CaliParks.org will help you find out where you can do your favorite activities, from rock climbing to shooting hoops.
And you can see what people are doing in those parks from photos contributed by more than half-a-million people, updated from Instagram and Flickr daily. Counts of Tweets and Foursquare check-ins within the parks give a sense of overall popularity and use.
Tabulating those numbers in recent weeks reaffirmed a core tenet of this project: Parks are social. And diverse Californians will see themselves in parks. We know because we see them on social media. And showing that creates an invitation to share these public spaces. Indeed, the fundamental design principle for this project is that open data and social media can come together to create deeply useful tools to help people get out and enjoy nature together in their daily lives.
Based on R&D work we’d done to harvest social media posted within every public park in the state (parks.stamen.com), we were engaged by Resources Legacy Fund and the state of California’s Parks Forward Commission, charged with charting the future of public parks in the nation’s most populous state.
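The core of that harvesting step is deciding whether a geotagged post falls inside a park boundary. A minimal point-in-polygon sketch using the standard ray-casting test; the polygon format is an assumption (in practice the boundaries come from the California Protected Areas Database, typically with multi-ring geometries and a spatial index):

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is point pt inside the polygon?

    pt: an (x, y) pair, e.g. (lon, lat) of a geotagged photo.
    poly: list of (x, y) vertices of a single park boundary ring
    (assumed format; real park geometries are more complex).
    """
    x, y = pt
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        # Count crossings of a horizontal ray from pt with each edge.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside
```

Run every harvested post through this test against every park (with a bounding-box prefilter, in practice) and each photo lands in its park.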
The products of our previous research work — complex maps of real-time social media that are beautiful in their own right — became the infrastructure upon which we’ve built a simple, powerful tool.
The first view of the site is a prominent search bar where you can enter a location (anything Google can geocode) or hit the “locate me” button to get a list of parks near you. You can also tap one of the “story blocks” to get nearest-first curated lists we’ve assembled around several different themes.
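Once a location is geocoded (or read from the browser), the nearest-first list is a distance sort. A sketch assuming a hypothetical "centroid" field per park, not the actual CaliParks schema:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

def nearest_parks(user_loc, parks, n=5):
    """Return the n parks nearest to user_loc, sorted nearest-first.

    "centroid" is an assumed field name for illustration.
    """
    return sorted(parks, key=lambda p: haversine_km(user_loc, p["centroid"]))[:n]

parks = [
    {"name": "Far Park", "centroid": (38.5, -121.5)},
    {"name": "Near Park", "centroid": (37.8, -122.4)},
]
result = nearest_parks((37.77, -122.42), parks, n=2)
```

In production this sort would run against a spatially indexed database rather than an in-memory list, but the nearest-first contract is the same.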
From there, you can narrow and tailor your search to see just the activities you’re most interested in.
At launch, we have activities data and official park URLs for 816 large wilderness parks, plus key activity data for 5,447 urban parks across the state. And we have a web application that's built for mobile, with an interface available in English and Spanish (a big shout-out to Latino Outdoors for help on translation and user testing!), all tied into the native routing software available on every smartphone.
But it’s really just a start. There’s a lot more data that could be harvested, and a lot more service to provide to California, and potentially other states and nations!
We especially hope to expand on the key idea that using social media and open data as a content database both radically opens up the parks conversation and also substantially reduces the risk of launching a new application.
A radically open conversation means that sometimes we might hear and see things that make us uncomfortable. But with daily image harvesting, the solution is clear: Head out to the park you love and post some images! (Just make sure your location services are enabled.)
Result: A richer and more engaging view about that park on CaliParks.org.
But more importantly: A richer and more engaging representation of that park across diverse social media conversations.
It might have been safer to build a walled garden: A site where we could control all the content all the time. But securing images for nearly 12,000 parks would be a monumental task.
And if we’d gone down that road, tending that walled garden would do little for the larger conversation around parks in California. And the more effort that goes into a walled garden, the greater the risk: Walled gardens can be beautiful. They also tend to be short-lived.
By tapping into the rich social stream created by hundreds of thousands of people living their lives and sharing their experiences, we hope CaliParks.org can be a key tool in building the parks conversation and community over the long haul.
Now, it’s time to get outside. Find and share your next adventure at CaliParks.org!