February 2010


Project Proposals and Spring 2010 | 25 Feb 2010 11:33 am

What problem is your project aimed at solving? Alternatively, why would someone want to use what you are making in your project?

The first immediate goal of the American Recovery and Reinvestment Act of 2009 is to “create new jobs as well as save existing ones.”

While Emily’s initial project idea was to compare jobs created and saved vs. unemployment, some initial research showed that the exact number of jobs created per grant or contract is difficult to determine with the regularity and reliability needed for a machine-readable, automated data mashup.  A good example is the problem of bulk grants.  The Regents of the University of California have been earmarked to receive a grant of $716 million, which will save some 38,923.98 jobs, but the distribution of those jobs within the state of California is not clearly delineated; furthermore, the jobs figure is an unspecified mix of full-time employees, part-time employees, sub-contractors, and vendor hours, with no county placement given.

Because of these problems (the lack of consistency and specificity needed to achieve the initial project goal), we will focus on a slightly different question within the manageable scope of Alameda County, California:  How has the ARRA affected (or not affected) unemployment in Alameda County?  If the ARRA has been “effective,” we would hope to see unemployment decreasing or holding steady over time as funds are disbursed.  Of course, it is difficult if not impossible to draw causal conclusions from this comparison, as opposed to examining job-for-job data (and taking into account other indices of the recovering economy), which, as mentioned above, is an intractable problem until more granular data can be reliably extracted.

Is your project doable given the constraints of time, our starting knowledge, etc?

Yes, we think so.  We’ve narrowed the geographic scope from the entire U.S. to Alameda County and will need only two sources of data:

  1. Month-by-month unemployment data in Alameda County from January 2009 to the present
  2. The start dates of all ARRA grants/contracts disbursed in Alameda County to date

The visualization portion will be tricky, but I am confident that we have the resourcefulness necessary to complete this part of the project.

What interface are you imagining? Is it a web, desktop, or mobile application? What platform are you running?

We want to build a web visualization which can be viewed with the Firefox web browser.

What data or services are you planning to bring together?

  1. Unemployment data from the Bureau of Labor Statistics (http://www.bls.gov/lau/#tables)
  2. ARRA Project award start dates and amounts from http://www.recovery.ca.gov/HTML/RecoveryImpact/map.shtml?county=Alameda
  3. ClearMaps or Tableau or Google Visualization API

What’s your plan for getting the data? Often the data you might want might not be easily available.

We will download the latest unemployment and ARRA files available and filter them for Alameda County.
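That filtering step could start as small as the sketch below. The column names and file layout here are assumptions for illustration; the actual BLS and recovery.ca.gov downloads will need to be inspected first.

```python
import csv
import io

def filter_by_county(csv_text, county, column="County"):
    """Return only the rows whose county column matches `county`."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row.get(column, "").strip() == county]

# Tiny inline sample standing in for a downloaded ARRA awards file.
sample = """Award,County,Amount
A1,Alameda,100000
A2,Los Angeles,250000
A3,Alameda,50000
"""

alameda = filter_by_county(sample, "Alameda")
print(len(alameda))                             # 2
print(sum(int(r["Amount"]) for r in alameda))   # 150000
```

The same loop works unchanged on a file object, so the real downloads can be streamed through it once the actual column names are known.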

Do the APIs you plan to use actually support the functionality that you need in your application?

No, we are not using any APIs, except for maybe the Google Visualization API, but that is not a data source.

What programming language do you plan to use?

Either Python or PHP, depending upon which one is easier to use with the data.

Action Plan:

  1. Download and examine unemployment and ARRA data files.
  2. Filter the files by Alameda County.
  3. Find zip codes for Alameda County and extract relevant ARRA data from master file.
  4. Investigate best mapping/visualization tool, hopefully one which will allow us to make monthly step-wise comparisons in unemployment, new project contract start dates, and the percentage of ARRA money spent out of the total currently slated for Alameda County.
  5. Integrate the data sets with the visualization utility.
  6. Celebrate.

Risk Areas/Mitigation Plan:

Right now our biggest risk is scrubbing the ARRA data.  We expect that not every award will list its zip code, that some listed zip codes may span more than just Alameda County, and that the field may sometimes be blank.

Our best mitigation plan might actually be to expand the scope of the project to include all of California, since the award’s state is a more reliable data point.

Spring 2010 | 24 Feb 2010 12:36 pm

Day 10 Notes.

Project Proposals and Spring 2010 | 24 Feb 2010 12:32 am

– What problem is your project aimed at solving? Alternatively, why would someone want to use what you are making in your project?
My project is not necessarily aimed at solving a problem; it is more an interesting tool that might turn out to be useful. I am making this tool so that a user can quickly and easily check how the counties or districts of California compare on different issues (environmental, gay rights, tax reform, etc.), the party residents registered for, their age, and possibly more factors (based on how the population voted, registered, or on Census data). Beyond satisfying a user’s curiosity, I can imagine a scenario where a user is thinking of moving and wants to know which areas are fairly well aligned with their political views to help in making the decision. Another scenario would be quickly finding areas in the state that could use more attention during campaign season, based on how populations have voted in the past.
– Is your project doable given the constraints of time, our starting knowledge?
This project is doable, but likely in a limited fashion. I will take a scaling approach and start with just California split into counties. I’ve still had trouble finding good sources of election result data sets, but I have found fairly detailed registration data. This gives me slightly different data to map, but a good amount to work and build with. If I have to, I can hand-input statewide vote results to work with the existing data. I’m also confident that my technical ability can keep up with the needs of this project.
– What interface are you imagining? Is it a web, desktop, or mobile application? What platform are you running?
This would be a web-based application. A user will be able to select the parameters of the color-coded map, and then the map will be generated (the picture below is edited from a PDF downloaded from http://swdb.berkeley.edu/maps.html to resemble a zoomed-in view of a map comparing registered Democrats versus registered Republicans). If I can get all that running smoothly, I would like to add the ability to change options on the fly and have the map recolor itself, but that might be beyond my abilities for now.

– What data or services are you planning to bring together? Be specific.
I will be bringing together voting, registration, census, and other data from different cities, counties, and states to create color-coded comparison (temperature) maps.
– What’s your plan for getting the data?
These are files that I can find very separately (pretty much as subdivided as possible) but am still hoping to find in one place. I will also be introducing registration data from http://swdb.berkeley.edu/. This registration data will really widen the parameters that maps can be based on. I am currently investigating census data as well as http://www.datamasher.org/ (a recent Sunlight Foundation Apps for America 2 winner) for further data sets.
– Do the APIs you plan to use actually support the functionality that you need in your application? Show how it does so.
This is where most of the complexity of this project will likely come in. I would love it if I could just write a script to grab the information I need, but it looks like I will most likely need to grab the data myself and possibly even parse it myself.
– What programming language do you plan to use?
PHP. I think server side is where I’ll be keeping all my data so that is where I’ll be doing all my computing.
– Break down the project into steps. You can end up changing the steps later, but I want to make sure you have a clear conception on what the steps are.
1) Write a script to take data from http://swdb.berkeley.edu/ and return the hex color code each district should be colored.
2) Write a script to generate the map based off the resulting colors.
3) Create user interface to select from voter registration data.
4) Find more relevant data to use.
5) Add scripts to generate correct numbers from new data files.
6) Add data to parameters available to users.
7) Repeat from 4 with more data or more states.
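Step 1 could start as small as the function below, which maps a Democrat/Republican registration split to a hex color. It is shown in Python for brevity (the same arithmetic ports directly to PHP), and the blue-to-red scale is just one possible choice.

```python
def ratio_to_hex(dem, rep):
    """Map a Democrat/Republican registration split to a hex color:
    pure blue for all-Democrat, pure red for all-Republican, and
    purple shades in between."""
    total = dem + rep
    if total == 0:
        return "#808080"   # gray for districts with no data
    share = dem / total    # fraction registered Democrat
    red = round(255 * (1 - share))
    blue = round(255 * share)
    return "#{:02x}00{:02x}".format(red, blue)

print(ratio_to_hex(1000, 0))    # "#0000ff" (all Democrat)
print(ratio_to_hex(0, 1000))    # "#ff0000" (all Republican)
print(ratio_to_hex(500, 500))   # "#800080" (even split)
```

A per-district loop over the registration file then only needs to look up the two counts and call this function.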
– Identify areas of “high risk,” areas that you are uncertain about and/or things that might undermine the entire project. Write about how you are planning to deal with these potential problem areas.
1) I’ll likely have to move to Python if I want to allow for on the fly changing of map parameters.
2) Difficulty in finding the data sets I started out looking for has caused me to start using other data. This data is still interesting, but not quite what I originally intended; I would really like to get voting data, but hand entry will prove too time-consuming, so if it comes to that I will likely only feature a few options in that category.  This might be solved by allowing users to update data as they see fit, but I would then have to provide an interface for user input.
Project Proposals | 23 Feb 2010 11:27 pm

The aim of this project is to make the ARRA data more meaningful to people
who might find the data reported at recovery.gov difficult to make sense
of, or difficult to relate to their own lives.

A news feed that can provide users with some context for what recovery act recipients are accomplishing locally might help make the data more accessible, although I am still having trouble figuring out how to put together a news feed that could provide meaningful/useful context without overwhelming that context with noise.  Some possible sources might be feeds using Google News, Yahoo search, Google Blog Search, and the NYT APIs.  The goal would be to combine news feeds focused on recovery act recipients with the recipient-reported data at recovery.gov, so that if a user is looking at particular recipients, s/he gets links to related stories.
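One small, concrete piece of this is turning a recipient name into a feed query. The sketch below builds a Google News RSS URL in Python; the base URL and parameter names are assumptions for illustration, and each source (Yahoo, the NYT APIs) will have its own query format.

```python
from urllib.parse import urlencode

def news_feed_url(recipient, locality=None):
    """Build a news-search RSS URL for a recovery act recipient.
    The endpoint and parameters here are illustrative assumptions."""
    terms = [recipient, "recovery act"]
    if locality:
        terms.append(locality)
    return "http://news.google.com/news?" + urlencode(
        {"q": " ".join(terms), "output": "rss"})

url = news_feed_url("Regents of the University of California", "Berkeley CA")
print(url)
```

Fetching and merging the resulting feeds per recipient is then a separate step, and the hard part (filtering noise) still lives in how the query terms are chosen.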

I am imagining a web interface, maybe starting with a searchable grid
layout similar to the “advanced recipient reported data search” at recovery.gov
http://www.recovery.gov/pages/TextViewProjSummary.aspx?data=recipientAwardsList

but with the retrieved data focused on news feed information, together
with related ARRA funding data.  If possible, I would also like to create a
searchable map interface.

The ARRA data can be downloaded from recovery.gov, or possibly accessed through the RPI SPARQL project.  New York Times data can be accessed through several APIs — the Search and Tags APIs are promising for stories that make the national news, but other sources will be necessary for local coverage outside of the New York metropolitan area.

I am not sure what languages I will need to use for this project– in
addition to (possibly) Python, I guess PHP?

I am also not sure how doable this project will be given time constraints and
my starting knowledge.  I have some coding experience but not in this
context, and very little experience with web architecture or web
interfaces.

Currently I’m exploring different news sources to see what combinations of information can provide useful context for a user looking at recipients in a particular zip code.  In addition to the other difficulties already mentioned,
identifying news sources/searches that would be meaningful to users, without requiring a lot of winnowing on the user’s part, feels like the most important barrier right now.

Project Proposals and Spring 2010 | 23 Feb 2010 10:40 pm

Hi all, I am Julian and this is my project proposal.

What problem is your project aimed at solving? Alternatively, why would someone want to use what you are making in your project?

I am thinking about extracting real-time air quality data from the California Air Resources Board and creating an air quality map along with real-time traffic data (meshed with Google Maps). Combined with various meteorological data such as wind speed, temperature, and precipitation, one could look at the various impacts traffic has on air quality.

What interface are you imagining? Is it a web, desktop, or mobile application? What platform are you running? Often a rough sketch of the interface can help clarify a lot of issues

The interface would be a Google map. Since a lot of computation is involved, I would expect the platform to be strictly desktop and web.

What data or services are planning to bring together? Be specific.

I am planning to bring together the data from California Air Resources Board and the Mobile Millennium Project. The Mobile Millennium project “will design, test and implement a state-of-the-art system to collect traffic data from GPS-equipped mobile phones and estimate traffic conditions in real-time. It is a partnership between government, academia, and industry.” [http://traffic.berkeley.edu/theproject.html]

What’s your plan for getting the data? Often the data you might want might not be easily available.

I am already in contact with the Professor responsible for this project in my department and we are sorting things out.

Do the APIs you plan to use actually support the functionality that you need in your application? Show how it does so.

Google Maps will be able to plot the traffic data as color-coded paths, which would show the level of congestion on the road. Air quality data could also be easily shown on the map by pinpointing each sampling station and color coding its pin.

What programming language do you plan to use?

Javascript.

Provide an action plan:

Break down the project into steps. You can end up changing the steps later, but I want to make sure you have a clear conception on what the steps are.

1. Obtain traffic data in a workable format.
2. Analyze and devise a scheme to automate extraction of air quality data.
3. Plot the traffic and air quality data on the same map (trial, using a particular set of data).
4. Plot the traffic and air quality data on the same map, using different sets of data.
5. Create a user interface where date and time could be selected for a specific set of data.
6. Animate changes in traffic and air quality data over time.
7. Integrate the system for future data extraction.

Highlight what you are currently working on.

Currently I am discussing with the professor how to get the traffic data in a workable format. The data is there; it is just difficult to get it into a Google-Maps-friendly format.

Identify areas of “high risk,” areas that you are uncertain about and/or things that might undermine the entire project. Write about how you are planning to deal with these potential problem areas.

One of the high risk areas is that the traffic data might not even be available in a workable format for me to integrate into the map. In that case, I would either have to look for alternate traffic data or use a different approach in showcasing the air quality data, such as comparing the changes in air quality with the changes in Federal and State air quality standards over the years.

*NOW WORKING ON…*

Writing a JavaScript program to extract CSV data files from the California Air Resources Board and put them into a database.
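Whatever language the loader ends up in, the shape of the task is the same: parse each CSV and insert its rows. A minimal sketch in Python with SQLite, where the column names are placeholders rather than the actual CARB export format:

```python
import csv
import io
import sqlite3

# Placeholder columns standing in for a CARB air quality export.
sample_csv = """site,date,pollutant,value
Oakland,2010-02-01,O3,0.041
Fremont,2010-02-01,O3,0.038
"""

conn = sqlite3.connect(":memory:")  # use a file path for a real database
conn.execute("""CREATE TABLE readings
                (site TEXT, date TEXT, pollutant TEXT, value REAL)""")
rows = csv.DictReader(io.StringIO(sample_csv))
conn.executemany(
    "INSERT INTO readings VALUES (:site, :date, :pollutant, :value)",
    rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # 2
```

Keeping the readings in a database (rather than re-parsing the CSVs) makes the later date/time selection step a simple query.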

Spring 2010 | 23 Feb 2010 09:43 pm

This is more or less the proposal I sent to the class list last week, but it takes into account a few of the things we’ve talked about since then.

What problem is your project aimed at solving? Alternatively, why would someone want to use what you are making in your project?

My project is aimed at solving one of the fundamental problems of street parking in San Francisco neighborhoods: knowing when and where to move your car. Most blocks in neighborhoods with residential permit parking have a two-hour window each week where all cars must be moved off the street for cleaning. The problem arises because the street layout suggests nothing about the pattern of days (Monday cleaning vs. Tuesday vs. Friday) nor the times (8-10 a.m. vs. 6-8 a.m. vs. 9-11 a.m.) when the streets have cleaning.

This is partly about tickets — the cost of even one permit parking violation nearly equals the cost of the street permit for the whole year — but it’s also about efficient use of time. There’s a good amount of strategy involved in parking on city streets: In general, you want to find the space that will allow you the maximum number of days before moving again (e.g., if you’re moving your car on a Thursday morning, a Wednesday space is a gem, while a Friday is just an annoyance). But there are exceptions: Say you know you have to get up early to drive somewhere on Monday morning, so you’ll purposely seek out a Monday space so you don’t have to wake up early twice in the same week. Knowing where, say, all of the Tuesday parking blocks are would help me figure out my parking plan before I even leave the house.

Additionally, I imagine a crowdsourcing element of the project. In many San Francisco neighborhoods, this is a time of heavy construction (partly due to funding from ARRA!) and streets that are normally open for parking have been occasionally blocked for weeks if not months at a time. I imagine pulling in Twitter content (either using geolocation information on tweets or specific hashtags relating to the different parking zones in the city, e.g. #SFparkingS, #SFparkingZ) where residents could share information about which blocks are open and which are closed.

Is your project doable given the constraints of time, our starting knowledge, etc?

After discussing this project in class, I do believe it’s doable, though I’ll probably attack it first on a smaller scale, perhaps just using one of the city-provided KML files to attempt to create this kind of map for a single parking zone at first.

What interface are you imagining? Is it a web, desktop, or mobile
application? What platform are you running?

In the best-case scenario, this would be a website/mobile app combo. However, I believe a lot could be accomplished by an interactive map on the web to start with, especially if any of that information could also be accessed via a web browser on a smartphone. Having it be web-only solves some of the problems I’m trying to fix (I could pull up the list of all the nearest Thursday parking spaces before leaving the house, for example), but it introduces new ones (needing to write down those streets/blocks to have them handy in the car). Still, it’s better than what I have now!

What data or services are you planning to bring together? What’s your plan for getting the data?

The sfdata.org site offers several datasets (mostly KML files) that will likely be of use to me, and I’ve already downloaded most of them. One set offers maps of the street sweeping routes in the city and the days and times each block
(or, more specifically, each side of each block) has cleaning, though at this point I’m not sure how that data can actually be used. Another set lists all the temporary permits that could block off parking spaces, which may be a little tougher to integrate (not all blocked parking would necessarily be residential parking, so I’d need some pretty precise location data, and I’m not sure at this point what exactly these files offer in that respect). With Twitter, I’ve already mentioned the hashtags I could potentially use to collect data.

Do the APIs you plan to use actually support the functionality that you need in your application? Show how it does so.

I now know a little bit more about how to work with the KML files and the Google Maps API, so I do believe it’s possible to integrate the two — and, more importantly, that it’s possible for me to learn enough to integrate the two! That said, I still need to learn how to parse the text information that’s included with the KML files and figure out how to best display it on the map. Is it possible I’d actually end up redrawing the map, using a different color line for each day that there’s street sweeping? Or is there an easier way to mess with the information I already have?
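KML is just namespaced XML, so pulling the names, descriptions, and coordinates out of each Placemark is tractable with a standard parser. A Python sketch against a made-up placemark (the real DataSF files may use a different KML namespace version and element layout):

```python
import xml.etree.ElementTree as ET

KML_NS = "{http://www.opengis.net/kml/2.2}"  # namespace varies by KML version

# Hypothetical placemark resembling a street-sweeping segment.
sample_kml = """<?xml version="1.0"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Fell St (north side)</name>
      <description>Sweeping: Thu 8-10am</description>
      <LineString>
        <coordinates>-122.43,37.77 -122.44,37.77</coordinates>
      </LineString>
    </Placemark>
  </Document>
</kml>"""

root = ET.fromstring(sample_kml)
placemarks = []
for pm in root.iter(KML_NS + "Placemark"):
    name = pm.findtext(KML_NS + "name")
    desc = pm.findtext(KML_NS + "description")
    coords = pm.findtext(".//" + KML_NS + "coordinates").split()
    placemarks.append((name, desc, coords))
print(placemarks)
```

The extracted coordinates can then be handed to the Maps API as an overlay, with the line color chosen from the parsed sweeping day.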

The Twitter piece would clearly be the easiest here, since I could just use the RSS feeds of the hashtags.
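Building those hashtag feed URLs is a one-liner; the sketch below uses the search.twitter.com Atom endpoint as it existed in early 2010, which may well change.

```python
from urllib.parse import quote

def hashtag_feed_url(tag):
    """Atom feed URL for a hashtag search (endpoint as of early 2010;
    treat this as an assumption to verify)."""
    return "http://search.twitter.com/search.atom?q=" + quote("#" + tag)

print(hashtag_feed_url("SFparkingS"))
```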

What programming language do you plan to use?

Not sure – but I hope that whatever I have in mind can be done in Python + whatever we learn in class, because that’s what I know!

Tentative Action Plan
Step 1: Gather my data from city sources (mostly done)
Step 2: Figure out what that data actually includes and how it can be used
(Step 2a: Understand what information is included in a KML file and how to use it)
Step 3: Learn how to use a mapping API and find out what happens if I try to use a KML file with it.
Step 4: Determine the scope of my project (i.e., does it turn out that these files are so easy to work with that I can cover all the parking zones in the whole city? Or should I narrow my focus to one specific, geographic area?)
Step 5: Attempt to integrate the KML street sweeping data with the map.
Step 6: Attempt to integrate the KML parking permit data with the map.
Step 7: Work on various methods for searching (see “high-risk areas” below)
Step 8: Attempt to integrate Twitter with the map

High-Risk Areas/Questions
1. The KML files might not provide exactly the information I’m looking for, or they need to be parsed in a way I don’t understand yet. For example, now that I’ve looked at one of the KML files in Google Earth, I realize that I have no idea how they’re broken up — the files neither correspond to a parking permit zone nor to a government district. That makes me worry about my plan to just cover one zone, because I’m not sure the KML file I would be using would correspond to anything useful.

2. For this to really work well, I’d need to develop a couple of different ways of searching the information — perhaps pulling up a map of the streets immediately around an address as well as allowing someone to enter an address and pull up the nearest Tuesday or Friday spaces. That means dealing with outputs in different formats – I might want to show a route on a map for the first but pull up a text list for the second.

3. Is the programming required for this going to be totally over my head? From our class discussions, it sounds like figuring out how to do spatial search will be the hardest part hands-down.
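On the spatial-search worry: for the scale of one parking zone, a brute-force nearest-neighbor search is enough and fits in a few lines of Python. The block schema below (name, day, lat, lon) is made up for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))

def nearest_blocks(here, blocks, day, n=3):
    """Return the n closest blocks swept on the given day.
    `blocks` is a list of (name, day, lat, lon) tuples (made-up schema)."""
    matches = [b for b in blocks if b[1] == day]
    return sorted(matches,
                  key=lambda b: haversine_miles(here[0], here[1],
                                                b[2], b[3]))[:n]

blocks = [("Fell 400", "Tue", 37.774, -122.432),
          ("Oak 500",  "Tue", 37.773, -122.435),
          ("Page 300", "Fri", 37.772, -122.433)]
result = nearest_blocks((37.7745, -122.4325), blocks, "Tue", n=1)
print(result[0][0])  # "Fell 400"
```

A sorted scan like this stays fast up to thousands of blocks; a real spatial index only becomes necessary if the project grows to the whole city with interactive queries.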

Spring 2010 | 22 Feb 2010 10:42 am

See Day 9 notes on Ajax and an intro to the Google Maps API.

Spring 2010 | 17 Feb 2010 04:18 pm

Day 8 notes have been posted.

Uncategorized | 10 Feb 2010 12:31 pm

See the notes.

Uncategorized | 08 Feb 2010 12:20 pm

Notes to be filled out as we go — please post your answers to exercises #1 and #2 as responses to the Day 6 notes.
