12 May 2013
In case you missed it, the Globe ran a front page Metro story on our smartphone data collection last week. We have had an upsurge in participation in the last week, but still could use a few more participants. If you have an Android phone, please do consider taking 10 minutes to participate. And we will post any findings from the data on this blog as well as on Volunteer Science.
2 May 2013
My lab (primarily Yu-ru Lin) has created an interactive tweet map around the Boston bombings, looking at the emotional content of tweets from people who were near the Marathon finish line at around 2:50pm, following them through the day. Below is a short clip tracking just the level of fear through the day (notably, it took 20-30 minutes for fearful tweets to rise).
Btw, we still need volunteers for our smartphone data collection, described above. I hope some readers will be game to participate.
28 April 2013
I am pleased to announce this upcoming seminar at Northeastern University:
Mark Newman (University of Michigan)
Wednesday, May 1st from 3:00 p.m. to 4:00 p.m.
Statistical inference, machine learning, and large-scale structure in networks
A fundamental problem in the study of networks is that of understanding their large-scale structure. Methods drawn from statistics and machine learning, especially maximum likelihood methods, can help. In this talk I'll describe, with accompanying examples, a suite of methods developed in recent years that can shed light on things like community structure, hierarchy, and status (or ranking) within networks. I'll also describe some intriguing connections between these methods and other areas in physics and engineering, including spin-glass theory, signal processing, and the fundamental limits of computation.
5th floor, Dana Building, 110 Forsyth St., Boston, MA 02115 (on GPS, key in 100 Forsyth)
Enter the main doors, turn left, and take the LARGE FREIGHT elevator to the 5th floor.
Renaissance Garage #62, 835 Columbus Ave., Boston, MA 02120
Boston-Cambridge Colloquium on Complexity and Social Networks (BCCSN)
Co-sponsored by NULab and the Center for Complex Network Research (CCNR).
Please volunteer for our crowd-sourced study of communication around the Boston Marathon! We are launching an Android-phone-based data collection, similar to what we are doing around Hurricane Sandy, to understand the role that mobile phones played in communication on the day of the Marathon and in its aftermath. Our hope is that this will contribute to our understanding of crisis-related communication; we will also donate $3 to The One Fund Boston for each person who completes the study. In order to participate, you need to have an Android phone, and to either have been in Boston or have some significant Boston connections.
Note: we will post any findings from this data collection on this blog, so you can follow what we discover from the data.
14 April 2013
This 10-minute presentation of mine on computational social science (using "big data" to understand social systems), given at Harvard's CID, might be of interest to some readers of this blog. It covers issues ranging from detecting emergencies in sociotechnical systems to detecting "invisible" political networks in unstructured text:
A reminder-- if you are from the Northeast and have an Android phone, please participate in our study on communication behavior during Hurricane Sandy. We will be giving $3 to food banks in affected areas for every completed survey we receive.
1 April 2013
My lab, with the support of the NSF, is launching a crowd-sourced study of Hurricane Sandy, so as to better understand how people react in emergencies. If you were affected by Hurricane Sandy and use an Android phone, I hope you will be willing to help out; it will take 10-15 minutes of your time. And if you weren't affected, I hope you can pass this post on to someone who was.
How do people respond in large-scale emergency situations, like earthquakes and hurricanes? Understanding this should inform more effective responses to save lives and reduce hardships. Getting hard behavioral data in the moment and aftermath is difficult--because people have better things to do than to participate in a study. There is quite a bit of valuable research based on interviews after the fact, but such research necessarily relies on reconstructed memories of behavior.
There is another path--which is to study the data passively collected about people by the sociotechnical systems relied upon during emergencies. An outstanding example of this approach is the paper by Bagrow et al. that examined behavior, as captured by mobile phones, during a set of emergencies. The power of this approach is that it offers hard behavioral data at massive scale. The shortcoming, however, is that it cannot contextualize the data beyond geography. Who, exactly, are people calling? Their spouses? Friends? What are they communicating--the need for help, reassurances that they are OK?
Here we are launching a study that sits between these two approaches. Essentially, we are asking people to load an app on their Android phones (iPhone users: sorry, but for now we could only afford to develop for one platform). The app will ask about their situations during Hurricane Sandy, look at their calling and texting behaviors, and ask about their relationships with the people they contacted. We will therefore get a precise record of behaviors before/during/after Hurricane Sandy, contextualized within personal circumstances and particular relationships.
My motivation here is scientific and personal. I think there is the possibility to do great science here that is potentially consequential for people's lives, science that can inform interventions that will help people. And, having grown up on Long Island, and having spent the early part of my career in Red Bank, New Jersey--near the shore ("shaw")--I saw a lot of suffering among my friends and family in the aftermath, with very little I could do to help. But this study is at least something good that I can make out of a terrible thing.
21 March 2013
Below is another example of Coburn-endorsed, NSF-funded political science (circa 2010). Feel free to post other examples. The question now before the NSF, courtesy of our Congress: is there an economic/security benefit to improving our understanding of our democracy?
Sep 30 2010
New voters guide will heal rifts through voter education and dialogue
The new Living Voters Guide, a collaboration between Seattle's CityClub and University of Washington faculty, attempts to better inform voters about November's state ballot issues and to help them explore those issues through respectful dialogue.
The Seattle Times (Washington) - by Lance Bennett, Alan Borning and Diane Douglas
A recently commissioned Seattle Times poll found that Washington voters are frustrated and divided on policy questions. They trust neither party. They are disheartened by toxic partisan rhetoric and don't know where to turn for wisdom on the significant decisions before them in the upcoming election.
We have joined forces to develop a new technology that is inspired by three goals: Restoring trust in our neighbors, learning to trust our community's wisdom and demonstrating trust in President Jefferson's claim that an informed citizenry is the bulwark of a democracy.
Most citizens see the deeply divisive political stalemate that results when we relegate the framing of political discourse and shaping of public opinion to talk-show hosts, partisans and lobbyists. To reclaim a citizen-centered democracy, to rebuild public trust and civil discourse, we're going to have to do it ourselves at the grass roots.
The Internet and the technologies that connect people through it can provide access to vast and immediate information resources, diverse perspectives and person-to-person communication.
The newly released report, "2010 Civic Health in America," finds that the Internet also can be a boon to civic engagement. Residents of "Internet households" in America have a voting rate about 19 percent higher than that of non-Internet households, and those who go online on a regular basis are more likely to be involved in offline community activities as well.
At the same time, the Internet is hardly a panacea for creating a more civil society -- indeed, much of the current online discussion about political matters is anything but civil. It is essential to design new technologies that help foster deliberation and respect while still maintaining vigorous debate and free speech.
CityClub, University of Washington's Center for Communication & Civic Engagement and its Department of Computer Science and Engineering collaborated to produce a new Web-based resource to advance digital democracy in Washington state. With funding from the National Science Foundation, we developed an online resource to promote community discourse and deliberation on the nine critical ballot measures before Washington voters this November.
Our Living Voters Guide invites all Washingtonians to discuss these vital ballot measures together, to explore one another's positions, and to build a personal, customized platform that will inform their final vote.
This voters guide is co-created by everyone who participates. It evolves as neighbors across our state consider the trade-offs for each measure. It requires participants to pledge that they will not make personal attacks on others but focus on the issues before us.
It invites everyone to wrestle with both the pros and cons of the ballot measures in a deliberative path toward decision making.
The language of Initiatives 1100 and 1105 is difficult to distinguish. Resolution 8225, proposing a constitutional amendment, is very technical. Several of the November measures, especially Initiative 1098 proposing a top-earner income tax, would alter the structure of taxation and financial support for public services in our state.
The decisions we make on all these ballot measures are crucial. We will all feel their consequences immediately and for a long time. So will our children.
That's why it is imperative that we consider them carefully, with due deliberation and with the benefit of community wisdom in a forum that is nuanced, pluralistic and collaborative. We need to come together as citizens to explore our electoral choices -- without accusations, rancor and acrimony -- knowing that we're all going to share the profit and loss generated by our collective decisions on Nov. 2.
We hope the Living Voters Guide will help build a connected and informed electorate. We hope it will inspire public trust in one another. We offer it as our own ballot initiative to reclaim citizens' power and shared responsibility for making our democracy work.
Lance Bennett is the director of the Center for Communication & Civic Engagement, University of Washington. Alan Borning is a professor of computer science and engineering at the UW. Diane Douglas is executive director of CityClub.
20 March 2013
The Senate approved a 2013 spending bill today. Little noticed by most was a tucked-in amendment, approved by voice vote, to defund political science at the National Science Foundation. This was a follow-on to Senator Coburn's 2009 effort, which included critiques of a research project of mine. Ironically, and notably, Senator Coburn also approvingly cited our research on congressional websites in a press release the following year (quoted in full below)--presumably because it was critical of the lack of policy substance available on his colleagues' websites.
This was a small bit of a much larger project of ours, but the point here is that we actually did provide an insight that is indeed consequential--that as citizens we want representatives who make clear where they stand on the issues, and that Members of Congress were failing to do so with the most effective mechanism they had. Senator Coburn took the knowledge that we produced to inform public discourse, and to shame his colleagues to do the right thing.
In short, Senator Coburn's own actions (in 2010) highlight the value of knowledge over ignorance. Given the current dysfunctions of US government, surely we need more knowledge about politics, government, and governance now, not less. In contrast, this amendment--not yet law--represents a vote for darkness over light, a triumph of ideological certainty uninformed by evidence.
Sep 14 2010
Congressional websites muddy stands on issues
News@Northeastern - by Jason Kornwitz
Congressional websites obscure lawmakers' policy preferences, and lack input from constituents, according to a new study on the Internet's impact on politics conducted by Northeastern University professor David Lazer and his colleagues.
The researchers interviewed 100 congressional staff members who oversaw their office's websites in 2006, and analyzed all House and Senate websites based on criteria developed in collaboration with the Congressional Management Foundation, a nonpartisan nonprofit dedicated to improving Congress. Read the study here.
The National Science Foundation funded the research, as part of its "Connecting to Congress" project.
Lazer and his colleagues found that many congressional websites don't identify where a politician stands on hot-button issues such as abortion, gay marriage and health care, and some go so far as to exclude lawmakers' party affiliations.
It's a tactic that promotes political survival, but fails to uphold democratic values, said Lazer.
Lazer, an associate professor of political science and computer science, noted that voters often end up electing candidates without knowing their true positions on critical issues.
He acknowledged that legislators might tailor their messages to particular audiences on Facebook or Twitter, but explained, "The Internet often does not allow for targeting messages to micro segments of your audience. So, if you're going to post stuff that wins more votes, rather than loses votes, it has to be bland."
The study also found that the general public is rarely asked--whether through online surveys or focus groups--what features they would like to see on their representatives' websites.
It's a troubling sign for Lazer, who said communication between legislators and constituents is key to the health of our democracy.
"One would hope that the Internet would facilitate a robust discourse between representatives and citizens, and that the official websites would be an opportunity for representatives to spur and engage in that discussion," said Lazer. "But we're not really seeing that."
Lazer's coauthors on the paper, titled "Improving Congressional Websites," included Kevin Esterling, an associate professor of political science at the University of California--Riverside, and Michael Neblo, an assistant professor of political science at Ohio State University.
12 February 2013
For those of you attending AAAS in Boston this weekend, this panel might be of interest:
The Science of Politics
Friday, February 15, 2013: 1:30 PM-4:30 PM
Ballroom A (Hynes Convention Center)
"Politics" is an elusive phenomenon, with popular perception focusing on the importance of factors that do not seem subject to scientific inquiry; perhaps this is why National Science Foundation funding of the discipline has been under attack in Congress. However, even the founding figures of the United States viewed politics as if it were governed by logical processes. This panel focuses on emerging approaches within the discipline, with a focus on methods and ideas that have crossed over from other sciences, from molecular (genetic) analyses to international institutional determinants of political outcomes. The modern science of politics has revealed the substantial structure of political behavior and how institutions are shaped by and shape political behavior. The methods presented include field experimental work on political behavior, game theoretic approaches to politics, genetic foundations for political behavior, and network science-based approaches to political science.
David Lazer, Northeastern University
Barbara Jasny, AAAS/Science
Donald Green, Yale University
Field Experiments in Political Science: An Overview of Advances
Susan Hyde, Yale University
The Diffusion of Democratic Norms
David Lazer, Northeastern University
Network Science Meets Political Science
Rose McDermott, Brown University
Biological Influences on Political Outcomes
Daniel Diermeier, Northwestern University
Modeling Politics: Promise and Limits of Formal Models in Political Science
By David Lazer | 9:27 PM
12 November 2012
Let me offer a short reflection on the apparent triumph of the professional pollsters over their critics in 2012. To briefly recap, in the weeks leading up to the election, there was a fairly energetic critique from the right that mainstream media surveys were systematically biased toward over-representing Democratic constituencies in their samples. This critique seemed to be authentically held--not simply spin to keep hope alive for prospective Republican voters--as reflected by the stunned reactions of commentators on Fox, as well as the Romney campaign's apparent surprise at the outcome. (As one Romney advisor stated, "I don't think there was one person who saw this coming.")
There are two striking things to observe about this moment. The first is how good a job professional pollsters did, and the second is how robust the social consensus was on the right that Romney was going to win.
First, on average, professional pollsters were remarkably on target (if just slightly biased against Obama)--ultimately both at the state level and nationally. I should note that this was a particularly predictable election: if one had simply said the 2012 map would look exactly the same as 2008's, you would have had a hit rate of 96%; and if you had expected Obama to do a bit worse than in 2008 (say, subtract 2 points across the board), 100%. At another level, this pattern is irrelevant, since this was not information that fed into the polls. What is important is how well pollsters did in the face of increased obstacles to doing a good job: response rates to surveys have plummeted, and increasing numbers of individuals rely exclusively on (hard-to-reach) mobile phones. Despite these challenges, in aggregate, surveys are more accurate than ever--almost spot on in 2012.
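The persistence-baseline arithmetic above can be sketched in a few lines. The state margins below are hypothetical, chosen only to illustrate the hit-rate calculation--they are not actual 2008 or 2012 results:

```python
# Toy illustration of a persistence forecast: predict each state's 2012
# winner from its (hypothetical) 2008 margin, then measure the hit rate.
# All margins here are invented for illustration, not real election data.

margins_2008 = {"A": 12.0, "B": -5.0, "C": 0.5, "D": -9.0, "E": 7.0}   # Dem margin, pts
margins_2012 = {"A": 10.0, "B": -7.0, "C": -1.0, "D": -11.0, "E": 5.0}

def winner(margin):
    """Positive margin -> Democratic win, negative -> Republican win."""
    return "D" if margin > 0 else "R"

# A state is a "hit" when the 2008 winner matches the 2012 winner.
hits = sum(winner(margins_2008[s]) == winner(margins_2012[s]) for s in margins_2008)
hit_rate = hits / len(margins_2008)
print(f"persistence hit rate: {hit_rate:.0%}")  # state C flips, so 4/5 = 80%
```

With the actual 2008 and 2012 maps plugged in, this same calculation yields the 96% figure quoted above (48 of 50 states unchanged).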
How is this possible? This is worth far more reflection than a blog entry can offer, because not all communities meet challenges like these so effectively. How does this community channel inherently flawed human judgments in a fashion that is, on average, right? There are surely lessons to be learned here about the construction of knowledge and professional practices that have turned out to be quite functional collectively. Here I will simply speculate that it reflects three things. First, there is real-world feedback on the effectiveness of methods to address these challenges. This learning becomes embodied in a set of practices, which will, in turn, be incrementally changed by future experience. Moreover, none of these challenges arrived abruptly, allowing an iterative process of adaptation in which lessons were projected onto future surveys where these problems would be slightly worse. Second, there is natural competition among survey firms to be accurate, and thus motivation to take these lessons to heart. This is a decentralized process of both Darwinian selection and purposive adaptation: I strongly suspect there will be a serious re-evaluation of likely-voter models at Gallup, for example, which had a notably poor performance for the second election in a row. Third, there is a collective process of sifting through best practices. While there is certainly some desire to keep the secrets of success private, in fact a certain degree of transparency in methods is necessary; and this is a small world of professional friendships where knowledge is semi-permeable, allowing local innovation to provide short-run advantage while letting good practices disseminate. That is, there may be (as I have written about elsewhere) a good balance in this system between exploration (development of new solutions) and exploitation (taking advantage of what is known to work).
The system of pollsters might be contrasted with that of pundits. Do you expect a Darwinian culling of the right leaning pundits who missed the outcome? The answer is surely not. Nor will there be an adjustment of practices on the part of pundits who largely served up a mix of anecdotal pablum to their readers.
All of this is not to say that there might not come a time when the community of pollsters converges on the wrong answers, or that the challenges of the future will be such that there are no good answers. The data, however, do not suggest that such a moment is imminent.
And how did the right get it so wrong? How could the Romney campaign of successful political professionals, in part embedded in the same epistemic community as the broader set of pollsters, not have seen an Obama victory as a plausible (put aside likely) outcome? This was not a near miss on their part. Consider: at last count, you could have subtracted 4.7 points (!) from Obama's margin in every state and he would still have won (the Electoral College, if not the popular vote). Romney's campaign, and many commentators on the right, were living in a parallel world, one with fewer minority and young voters than ours. Again, I don't know the answer to this question. Likely key ingredients: an authentic ambiguity in how to handle the aforementioned challenges; a strong desire to see a Romney victory; an informational ecosystem that today provides the opportunity to produce plausible-sounding arguments rationalizing any wishful thought one might have; and a relevant subcommunity small, centralized, and deferential enough that a few opinion leaders could trigger a bandwagon. The result was a madness of the crowd, to borrow Mackay's phrase from some 170 years ago. The consensus in the Romney campaign may also have reflected the candidate's own certainty of victory (as suggested by the apparent absence of a drafted concession speech), which may have discouraged the articulation of dissenting perspectives on the state of the campaign.
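The uniform-swing arithmetic behind the 4.7-point figure can be sketched as follows. The states, margins, and electoral-vote counts below are hypothetical; the point is the method, not the data:

```python
# Uniform-swing sketch: subtract the same number of points from the
# incumbent's margin in every state, and find the largest swing under
# which he still reaches an Electoral College majority.
# All state data below are invented for illustration.

states = {  # state: (incumbent margin in points, electoral votes)
    "A": (2.0, 20), "B": (5.0, 29), "C": (8.0, 18),
    "D": (-1.0, 38), "E": (12.0, 55), "F": (3.5, 16),
}
TOTAL_EV = sum(ev for _, ev in states.values())  # 176 in this toy map
NEEDED = TOTAL_EV // 2 + 1                       # majority threshold (89)

def ev_after_swing(swing):
    """Electoral votes still won after subtracting `swing` points everywhere."""
    return sum(ev for margin, ev in states.values() if margin - swing > 0)

# Scan swings on a 0.1-point grid and keep the largest survivable one.
survivable = [s / 10 for s in range(0, 121) if ev_after_swing(s / 10) >= NEEDED]
max_swing = max(survivable)
print(f"incumbent survives a uniform swing of up to {max_swing:.1f} points")
# -> incumbent survives a uniform swing of up to 4.9 points
```

Run against the actual 2012 state margins, this is the calculation that yields the 4.7-point cushion cited above.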
By David Lazer | 7:43 PM