This past week was our final week to administer surveys and collect data. As we mentioned last week, we have a low response rate for our surveys for a variety of reasons: 1) some phone numbers are no longer in service; 2) former clients are not picking up their phones; and 3) some clients refuse to provide feedback through the survey. We’ve emailed our community leader to meet with him this coming week so that we can look through the limited data we do have and come up with a final approach to our data visualization and deliverable.
What We Observed/Learned
Now that our data collection is coming to a close, we are starting to notice surprising trends. For example, while most clients have experienced a marginal increase in rent payment, they have not experienced a significant change in commute time. Some clients were even able to lower their rent payment when they moved. There have also been a few cases in which clients were homeless for up to several weeks, and the challenge for us now is conveying that information in an appropriate manner. As a group, we realized that this data may be more difficult to visualize than we thought.
Certainly, there are pieces of information that we feel are more important to convey than others, but we also want to give an adequate representation of the data we’ve collected. It will be important for us to showcase that this is the information we’ve been given and that there may be certain volunteer (self-selection) biases associated with it. Those clients who still have working phone numbers and are willing to divulge information may have improved their situation, but what about the 80% of clients who have not responded to our survey, many of whose phone numbers are no longer in service?
With “big” data comes big responsibility, but our group isn’t exactly amassing large quantities of data that we can computationally analyze to reveal patterns, trends, and associations. However, we recognize that it’s not always about what you see. The great majority of our calls are going unanswered, despite repeated attempts at leaving friendly voicemails encouraging former clients to contribute to the cause we’re trying to support. Some phone numbers are out of service, which surfaces a larger question: are evicted clients unable to afford mobile phones once they’ve moved? How many of them have changed their numbers, and how many simply can’t afford phones after they’ve been evicted? What percentage of these inaccessible clients are homeless, or have had to move far from their original home and work? Big pieces of data are missing, which, to some extent, could compromise our project’s message.
There’s a general sentiment in engineering known as “fast-cheap-good.” It summarizes the traditional project management triangle, which represents project quality as the intersection of project cost, scope, and schedule. Conventional wisdom holds that you must sacrifice at least one of the three, because it is nearly impossible to succeed in all of them. Our point here is that we believe the scope and schedule of Mapping the Effects of Silicon Valley are reasonable, but the theoretical “cost” is the deficit we are experiencing in data collection. Ideally, clients would come to the clinics, reach their settlements, and have their stories followed throughout the process of eviction and relocation. Albeit a more arduous route, following up with clients frequently would provide a clearer story and humanize the effects of eviction. This approach, however, is close to impossible because it would require amounts of money and resources that CLSEPA and other involved parties simply do not have. As a result, the next challenge is to showcase the scope of what we have done for the project, the data we haven’t “collected,” and perhaps provide some recommendations to further this study and benefit CLSEPA and communities within the Bay Area.