The Internet, Web 2.0 and Social Networking technologies are enabling citizens to actively participate in “citizen science” projects by contributing data to scientific programs. However, the limited expertise of contributors can lead to poor quality or misleading data being submitted. Consequently, the scientific community often perceives citizen science data as not worthy of being used in serious scientific research. In this paper, we describe a technological framework that combines data quality improvements and trust metrics to enhance the reliability of citizen science data. We describe how trust models provide a simple and effective mechanism for measuring the reliability of community-generated data. We also describe filtering services that remove untrustworthy data and enable confident re-use of the data. The resulting services are evaluated in the context of the Coral Watch project, which uses volunteers to collect data on coral reef bleaching. (Published in 2010.)
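The abstract mentions trust metrics and filtering services without detailing them. A minimal sketch of the general idea, threshold-based filtering on a per-contributor trust score, might look like the following. All function names, field names, the smoothing scheme, and the threshold are illustrative assumptions, not taken from the paper:

```python
# Illustrative sketch of trust-based filtering of citizen-science records.
# The trust metric, field names, and threshold are assumptions for
# demonstration only; the paper's actual framework is more sophisticated.

def trust_score(contributor):
    """Naive trust metric: fraction of a contributor's past records
    validated by experts, with Laplace smoothing for new contributors."""
    validated = contributor.get("validated", 0)
    total = contributor.get("total", 0)
    return (validated + 1) / (total + 2)

def filter_records(records, contributors, threshold=0.6):
    """Keep only records whose contributor meets the trust threshold."""
    return [r for r in records
            if trust_score(contributors[r["contributor_id"]]) >= threshold]

contributors = {
    "alice": {"validated": 18, "total": 20},  # experienced, mostly validated
    "bob": {"validated": 1, "total": 10},     # mostly unvalidated reports
}
records = [
    {"contributor_id": "alice", "bleaching_pct": 40},
    {"contributor_id": "bob", "bleaching_pct": 95},
]
print(filter_records(records, contributors))
```

With these made-up figures, only the record from the well-validated contributor survives the filter; a new contributor with no history would start near a neutral score of 0.5 thanks to the smoothing.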
Civic minded citizen scientists in your community help meteorologists and the National Weather Service stay abreast of inclement weather with on-the-ground data.
Earlier this week, the Midwest and Northeast were slammed with tornadoes and thunderstorms that grounded planes and held up trains. Thousands of people along the Northeast corridor lost power as a result.
During such hazardous weather, we rely on the knowledge, skill and expertise of meteorologists and designated emergency personnel to keep us safe and in the know. They in turn rely on data supplied not just by satellites and Doppler radars, but also by a network of citizen scientists.
But wait. With all our sophisticated technology, what could a few volunteers possibly contribute?
“Radars can tell us that there is heavy snowfall, but radars don’t tell us how much, or if rain is mixing with the snow, or what damage is occurring. Our spotters do,” explains Tanja Fransen, Warning Coordination Meteorologist with the National Weather Service in Glasgow, Montana.
The ‘spotters’ she is referring to, known as Skywarn ‘storm spotters’, are a national network of over 350,000 volunteers who work with their local emergency and weather centers to monitor and report inclement weather. Skywarn was created in response to the Palm Sunday Tornado Outbreak, a particularly devastating series of tornadoes that ripped through Midwestern states in 1965. Overseen by NOAA’s National Weather Service, the Skywarn program trains citizens to identify severe storms and provide accurate reports of storm developments and effects.
During a storm, volunteers send in reports to National Weather Service forecast offices about what is happening locally. Meteorologists use this valuable ‘ground truth’ to validate data from their instruments and fill in information gaps, enabling them to make better predictions about what the storm might do next.
“Reports from our spotters can be the basis for issuing severe weather warnings. For the recent floods in Houston we received flooding reports from a variety of sources, including Skywarn spotters,” says Dan Reilly, Warning Coordination Meteorologist with the National Weather Service in Houston-Galveston. The Fort Worth National Weather Service office estimated that those floods dropped about 35 trillion gallons of water.
Skywarn storm spotters are a diverse group of people varying in age, background and skill level. What they do have in common is an interest in weather and public service. To be a Skywarn storm spotter, volunteers must attend free training courses which cover the basics of storm formations, accurate reporting techniques and, of course, storm safety. Last year alone, NOAA trained over 70,000 storm spotters.
And with the ubiquity of social media, having a pool of trained volunteers is ever more important.
“These days it is so easy to send in a picture or phone in an event. Our offices can gather a lot of information. But it also increases the possibility of false reports,” says Fransen. “Now more than ever, it is critical to have trained personnel who can critically think and evaluate what they are seeing. They are reliable resources, giving us information we can trust.”
In addition to on-the-ground storm spotters, the Skywarn network includes a subset of licensed amateur radio operators who provide additional assistance during storms. The National Weather Service forecast offices use amateur radio to maintain communication between on-the-ground storm spotters and forecasters. And during especially large storms, which can knock out phone service, amateur radio volunteers help keep their communities informed of new warnings and other critical information.
But the contributions of Skywarn volunteers don’t stop when the storm ends. The National Climatic Data Center archives all severe weather reports, and the data is used by insurance companies, researchers and other government agencies. You can check out recent reports on this map or access the archived data.
If you want to help your community the next time a storm hits, NOAA now provides online training modules. So with just a few clicks you can be on your way to becoming a Skywarn storm spotter.
And the folks at NOAA will certainly appreciate the help of their volunteers.
“In the community I work with, we have a lot of repeat volunteers,” says Fransen. “It is really good to see how civic minded and dedicated our volunteers are.”
Talk to those in your community and find out if there is a storm spotter among you! Are you a storm spotter or training to be one? Tell us about your experience in the comments below!
Skywarn is featured as part of SciStarter’s newsletter about citizen science projects that help during times of crisis. Learn more about other featured projects in the newsletter including GeoTag-X, Did You Feel It?, Did You See It? and Monitor Change: Fire Monitoring, and sign up to receive cool citizen science projects curated by SciStarter in your inbox!
The post Did you know ‘storm spotters’ in your community keep you safe during severe weather? appeared first on CitizenSci.
We show how machine learning and inference can be harnessed to leverage the complementary strengths of humans and computational agents to solve crowdsourcing tasks. We construct a set of Bayesian predictive models from data and describe how the models operate within an overall crowdsourcing architecture that combines the efforts of people and machine vision on the task of classifying celestial bodies defined within a citizen science project named Galaxy Zoo. We show how learned probabilistic models can be used to fuse human and machine contributions and to predict the behaviors of workers. We employ multiple inferences in concert to guide decisions on hiring and routing workers to tasks so as to maximize the efficiency of large-scale crowdsourcing processes based on expected utility.
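The abstract describes fusing human and machine contributions with learned probabilistic models. A common simplification of this idea is naive-Bayes combination of independent votes. The sketch below illustrates that simplified technique only; the prior, the machine probability, and the per-vote accuracy are all invented numbers, not the Galaxy Zoo paper's learned models:

```python
# Naive-Bayes fusion of one machine classifier and several human votes.
# Assumes votes are conditionally independent given the true class --
# a simplification of the learned Galaxy Zoo models; all numbers invented.

def fuse(prior, machine_prob, human_votes, human_accuracy=0.8):
    """Return the posterior P(class = 'spiral') after combining evidence.

    prior          -- P(spiral) before seeing any votes
    machine_prob   -- machine vision's P(spiral) for this image
    human_votes    -- list of True (voted spiral) / False (voted elliptical)
    human_accuracy -- assumed per-vote probability of being correct
    """
    # Likelihood of the observed votes under each hypothesis.
    like_spiral = machine_prob
    like_other = 1.0 - machine_prob
    for vote in human_votes:
        like_spiral *= human_accuracy if vote else (1 - human_accuracy)
        like_other *= (1 - human_accuracy) if vote else human_accuracy
    numer = prior * like_spiral
    denom = numer + (1 - prior) * like_other
    return numer / denom

# Machine vision is unsure (0.55), but three of four volunteers say "spiral":
posterior = fuse(prior=0.3, machine_prob=0.55, human_votes=[True, True, True, False])
print(round(posterior, 3))
```

Even with a low prior and an uncertain machine classifier, a few agreeing human votes push the posterior well above 0.85, which is exactly the kind of complementarity the abstract argues for.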
This is a guest post by Michael Bear, Citizen Science Project Director at Ocean Sanctuaries. In this post, he describes a citizen science-led effort to catalog marine life living in and around the HMCS Yukon, which in 2000 was transformed into an artificial reef as part of San Diego’s marine conservation effort.
In 2000, the City of San Diego, in collaboration with the San Diego Oceans Foundation (SDOF), purchased, cleaned and sank a 366-foot-long Canadian warship called the HMCS Yukon to create an artificial reef, a task that has been spectacularly successful. Sitting on the seafloor off the San Diego coast, the Yukon attracts dozens of local marine life species and is becoming a revenue-generating attraction for tourist divers from around the world.
When this project started, both the SDOF and the local scientific community were curious to understand the effects of an artificial reef on local fish populations and surrounding marine life. A joint study was undertaken by SDOF and Dr. Ed Parnell of Scripps Institution of Oceanography and released in 2004.¹ Crucial to the study was data gathered by local citizen science divers to generate a baseline of marine life species on the ship.
This year, Ocean Sanctuaries, San Diego’s first citizen science-oriented ocean non-profit, is conducting a follow-up study to the pioneering work of Dr. Parnell and colleagues. Established in 2014, Ocean Sanctuaries encourages and supports citizen science projects that empower local divers to gather marine data under scientific guidance, furthering our understanding of the oceans. Ocean Sanctuaries currently has three active citizen science projects. ‘Sharks of San Diego’ and the ‘Sevengill Shark ID Project’ are both shark related. The third project is the follow-up study on the Yukon, called the Yukon Marine Life Survey.
The data gathered in this project will be mainly photographic. Local divers will photograph specific areas of the ship in quadrats and along transect lines, and the data will be compared with the same areas examined in the 2004 study.
The project plans to use a web-based application for wildlife data management called ‘Wildbook’ for cataloging observations made in the Yukon Marine Life Survey. ‘Wildbook’ was originally designed to identify whale sharks, but will be modified as a multi-species database for use with the Yukon Marine Life Survey.²
Referring to the original Yukon Marine Life Survey of 2004,¹ Barbara Lloyd, Founder of Ocean Sanctuaries, says, “The Yukon Artificial Reef Monitoring Project (ARMP) was a short-term baseline study of fish transects and photo quadrats. The ARMP project has not been gathering data for about a decade now. We at Ocean Sanctuaries strongly believe that a follow-up study to the original baseline study can provide the research and fishing communities with valuable marine life data. In addition, unlike the original study, we intend to use photographs to ensure verifiable encounter data. We aim to create a large base of citizen scientists to take the photos and enter the data. This crowd-sourced data will allow us to collaborate between citizens and researchers.”
The current Yukon Marine Life Survey will span at least five years. Once completed, the data will inform scientists of changes to the marine life on the ship enabling California coastal managers to evaluate the impact of artificial reefs on local marine species. Take a video tour of the Yukon and learn more about the project at SciStarter.
References: 1. Ecological Assessment of the HMCS Yukon Artificial Reef off San Diego, CA, Dr. Ed Parnell, 2004.
2. Wildbook: A Web-based Application for Wildlife Data Management
Find more posts like “Citizen scientist divers help track the success of artificial reefs” by the Editorial Team on the SciStarter Blog, your source for citizen science and other science you can do.
Margaret Mead, the world-famous anthropologist, said, “Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has.”
The sentiment rings true for citizen science.
Yet, recent news in the citizen science world has been headlined “Most participants in citizen science projects give up almost immediately.” This was based on a study of participation in seven different projects within the crowdsourcing hub called Zooniverse. Most participants tried a project once, very briefly, and never returned.
What’s unusual about Zooniverse projects is not the high turnover of quitters. Rather, it’s unusual that even early quitters do some important work. That’s a cleverly designed project. An ethical principle of Zooniverse is to not waste people’s time. The crowdsourcing tasks are pivotal to advancing research. They cannot be accomplished by computer algorithms or machines. They require crowds of people, each chipping in a tiny bit. What is remarkable is that the quitters matter at all.
My grandfather used to cajole me into trying new food when I was a finicky youngster. “How do you know that you don’t like brussels sprouts? Try it, you might like it,” was his mantra. I would try it. I would hate it. Even though I quit brussels sprouts immediately, giving them a taste was important. Now I cook and eat them, and I while I don’t serve them to company, I can talk about how to cook them with other brussels sprout aficionados.
It is the trying, rather than the quitting, that is newsworthy. When I checked the website today, the Zooniverse had over a million participants (1,266,934 to be exact). Even if 73% are quitters (that’s the average quitter rate among the seven projects in the study), that leaves a core of 342,000 strong non-quitters.
What is even more interesting is that a core group of determined and dedicated people are the best citizen scientists. They are invaluable parts of participatory research projects. This is universally common (not only Zooniversely common). We see it spanning other styles of projects.
Consider, for example, the online project Foldit, where participants are gamers (or players) and the goal is to solve three-dimensional puzzles of protein folding. Foldit encourages players to demonstrate their mental prowess by solving over 30 tutorial puzzles with known answers before they can put their minds to the real puzzles. Most gamers are weeded out before they actually enter Foldit citizen science.
At the other end of the spectrum are community-based projects. For example, Global Community Monitor assists neighborhood groups in monitoring pollution, often through the use of bucket brigade technology (that is, supplies from Home Depot for DIY monitoring). They recommend a core group of five to do the heavy lifting of the project, such as data collection, organization, and education of neighbors.
In a case that landed Mark Kamholz, Environmental Control Manager for Tonawanda Coke Corporation, with a conviction and one year in prison (currently serving), the core community was only four individuals. It began when these four citizen scientists – Jackie, Adele, Bob, and Tim – sampled the quality of air. These four could not see, but could smell, the pollutants in their Tonawanda, New York neighborhood. I don’t know whether they liked collecting data, but quitting wasn’t an option. Their own health depended on citizen science. Their data caught the attention of Al Carlacci with the New York Department of Environmental Conservation. He collected additional samples in order to triangulate on the pollution source. This was only the second time in the United States that a corporate employee was convicted in criminal, rather than civil, court for polluting (11 counts of violating the Clean Air Act, and more), and the first time the conviction resulted in jail time. (So why are news stories focusing on citizen science quitters?)
Good citizen science design can mean that a core group does most of the work, while everyone benefits. It’s not like The Little Red Hen, where if you don’t help harvest the wheat, then you don’t deserve to get any bread. Participation is open to everyone, but that doesn’t mean everyone has to do it. Nevertheless, the results are for everyone. Science, especially citizen science, exists to improve society.
An Internet rule of thumb is that only 1% (or less) of users add new content to sites like Wikipedia. Citizen science appears to operate on this dynamic, except that instead of a core group adding existing knowledge for the crowd to use, a core group is involved in making new knowledge for the crowd to use.
eBird, where the highest skilled birders contribute most of the data, is a great example, one that I’ve highlighted before. Researchers, managers, and other birdwatchers use the information which is easily accessible and visualized in maps.
“Know your audience” is the golden rule for public speaking and writing. It holds for designing a citizen science project.
Citizen science has a long tradition in the natural history fields because it is easy to tap those with existing hobbies. It is particularly helpful where hobbyists have built communities that foster their individual and collective expertise and skills. Such projects avoid many problems related to data quality and sustained participation. Good project design involves finding a good match with existing participant expertise and interest.
For example, consider distributed computing, which is another style of citizen science, in which participants donate their unused computer resources to computationally intensive research problems. In this case, fandom groups, who tend to be tech savvy, include promising communities of interest. The largest fandom group to contribute to citizen science so far are the Bronies. Bronies are typically young adult males (bros) who are fans of the animated cartoon show, My Little Pony. A herd of about 1,000 Bronies play in Brony@Home, a team frequently near the top of competitions in a suite of distributed computing projects such as Folding@Home, Rosetta@Home, and Wildlife@Home.
In citizen science, a crowd can be four or a crowd can be hundreds of thousands. A citizen scientist is not a person who will participate in any project. They are individuals – gamers, birders, stargazers, gardeners, weather bugs, hikers, naturalists, and more – with particular interests and motivations.
As my grandfather said, “Try it, you might like it.” It’s fabulous that millions are trying it. Sooner or later, when participants and projects find one another, a good match translates into a job well done.
The post Coop’s Citizen Sci Scoop: Try it, you might like it appeared first on CitizenSci.
Recently, I sat down and had a think about what I had seen in the past, as well as some of the trends that I’ve been noticing. Today, I’m going to review some of those and also go out on a limb with some predictions as to where I see citizen science heading.
It’s Definitely a Thing, Now
In the last three or so years, I’ve noticed a sharp increase in the amount of mainstream interest in citizen science. Where it was once just the province of a smaller group of hardcore geeks (think: early adopters of the SETI@Home client), it now seems like everyone is talking about citizen science. Anecdotally, I’ve been interviewed by a fairly wide range of media outlets — everything from CBC Radio to Woman’s World. On the hard data side, the Google Trends history for citizen science bears this out.
There’s More Variety Than Ever
Citizen science projects are busting out all over, so there’s now a really impressive range of both topics and types of projects. Whereas once your choice was between the Christmas Bird Count, deploying BOINC, or playing with images from Mars, now you can do everything from raising Monarch butterflies to being a paleontologist in your kitchen.
The first recorded bibliography can be traced back to the Ancient Library of Alexandria. The library was founded by Ptolemy I Soter, a former Macedonian general and successor to Alexander the Great, and would go on to become a renowned center of scholarship.
The website History of Information records that in approximately 200 BCE, Callimachus, the highly respected head of the library, compiled a catalogue of its entire holdings. Called the Pinakes, which translates to ‘tables’ or ‘lists’, the catalogue divided authors into classes, arranged the authors within each class alphabetically, added biographical information to each author’s name, and listed the titles of each author’s works under their names.
With recent developments in citizen science the world over, a Pinakes for the field was inevitable. In a piece posted on the Extreme Citizen Science blog, Diana Mastracci writes that the team at Extreme Citizen Science, the UCL Interaction Center, and the University of Geneva compiled a list of scholarly resources (journal articles, books, web pages, magazine articles, etc.) considered to be most important to the study of citizen science: creativity, learning and education in citizen science, and the evaluation of citizen science projects.
“It was originally conceived as an idea for our EU-FP7 project Citizen Cyberlab, an initiative which is exploring technology-enhanced creative learning in the field of citizen cyberscience,” says Cindy Regalado, a PhD candidate in the Department of Civil, Environmental and Geomatic Engineering at University College London. “We knew that a large group of researchers and practitioners coming together would draw from various sources, both external and their own, so a shareable repository of articles made sense—shareable not only with the project team, but as a valuable resource for anyone interested in citizen science,” she says. As part of the Cyberlab project, the group proposed publishing an online annotated bibliography of relevant literature from citizen science, creativity and education at the end of the project’s first year.
They brainstormed the most important topics and themes in order to categorize the resources and create a series of tags that reflect the work, understanding and development of Citizen Cyberlab.
The main tags are: disciplinary domain (e.g. science, humanities, etc.), methods (the procedures, approach, techniques, plan or arrangements used in the article/book) and purpose (review, critique, reflection, ethical considerations, evaluation, etc.). Regalado says the original intention was twofold: “Considering that there is a great number of papers and articles written on citizen science, we wanted to select the ones that were relevant for our project, have them in one searchable place, and share them with ourselves and the world online.” It also serves as a way to categorize the vast number of papers, based on different categories relevant to the project. The main theme categories are citizen science, education, and creativity. The sub-themes in citizen science include typology, design, and evaluation of citizen science projects. The sub-themes in education and creativity include learning (informal, accidental, and online, etc.), games (serious games, leisure, gamification, etc.), definitions, and design (in, of and for citizen science).
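The tagging scheme described above is essentially a small taxonomy plus a lookup. As an illustrative sketch only (the field names, example entries, and values below are paraphrased from the description, not the actual Citizen Cyberlab/Mendeley schema), it could be modeled like this:

```python
# Illustrative model of the bibliography's tag taxonomy described above.
# All names and example entries are invented for demonstration; the real
# Mendeley collection uses its own schema.

TAXONOMY = {
    "disciplinary_domain": ["science", "humanities"],
    "purpose": ["review", "critique", "reflection",
                "ethical considerations", "evaluation"],
    "theme": ["citizen science", "education", "creativity"],
}

entries = [
    {"title": "Typologies of citizen science projects",
     "tags": {"theme": "citizen science", "purpose": "review"}},
    {"title": "Gamification in online learning",
     "tags": {"theme": "creativity", "purpose": "evaluation"}},
]

def find_by_tag(entries, key, value):
    """Return the entries whose tags match the given key/value pair."""
    return [e for e in entries if e["tags"].get(key) == value]

print([e["title"] for e in find_by_tag(entries, "purpose", "review")])
```

The point of such a structure is the one Regalado describes: a shared, searchable place where anyone can filter the literature by the categories relevant to their own project.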
Regalado adds, “When we published the bibliography we categorized the publications (including peer-reviewed articles, magazine articles, videos, etc.) according to the tags above, relevant to our project. Since then, more than 50 people not associated with Cyberlab have joined the collection, titled Citizen Cyberlab: Learning & Creativity Aided by ICT, on Mendeley, a popular bibliography management program, and have uploaded articles they think are relevant.” And if the keywords and categories they created are not relevant to your project, you can create your own.
“Seeing how it has grown on its own since we published it is really exciting,” says Regalado. “It was publicized when it first came out but we haven’t promoted it since, which means that people have just found it through their own searches. People are adding articles to it, which means they value it and want to contribute to it, and by doing so they also make it their own.”
This being citizen science, the group is interested to learn what topics are important to you as a citizen scientist! Anyone can join the group, and add to the bibliography—which is quickly earning praise—so please, read up on how to join here! The bibliography can be found here.