Open Data for Development in Latin America and the Caribbean » Open Data

Get involved

Open Data

OD4D presents mapping of Government Open Data initiatives in Latin America


Elisa Calza, OD4D coordinator for ECLAC, presented the objectives and first results of the OD4D initiative during a panel on Open Data, as part of the II Ministerial Meeting and IV Meeting of the National Electronic Government Network of Latin America and the Caribbean (Red Gealc) (see presentation). In particular, the focus was on presenting the progress of the document mapping Government Open Data initiatives in Latin America and the Caribbean.

This mapping is among the first results of OD4D and seeks to offer a qualitative picture of data-opening strategies in the region: like a photograph, it captures the state of the Open Government Data initiatives developed or under implementation in different countries. Through the map, reports for 12 Latin American countries (currently available only in Spanish) can be accessed. Future reports will also cover the English-speaking Caribbean countries.

The panel featured Nagore de los Rios (Director of Open Government and Internet Communication of the Basque Government), Felipe Heusser (President of the Fundación Ciudadano Inteligente, Chile) and Joao Battista Ferri (Director of Electronic Government in Brazil), who highlighted the features and growing role of Open Government Data in their respective Open Government and e-government initiatives.


OD4D participates in the II Ministerial Meeting and IV Meeting of the National Electronic Government Network of Latin America and the Caribbean (Red Gealc)

The II Ministerial Meeting and IV Meeting of the National Electronic Government Network of Latin America and the Caribbean (Red Gealc) is taking place in San José, Costa Rica, on November 26 and 27, 2012. The meeting brings together government ministers and e-government managers representing 29 countries in Latin America and the Caribbean, as well as various international experts, with the aim of promoting dialogue and sharing results, good practices and efforts on electronic government issues in the region.

The first day (November 26) highlighted the role that e-government has played, and continues to play, in the transformation of public administration, favoring greater effectiveness, as well as its importance in facilitating citizen participation, a phenomenon expected to increase significantly with the growing interoperability of activities and services.

Ample space was devoted to the emerging challenges for e-government: the role of social networks, cloud computing and mobile government (m-gov). Mechanisms for monitoring progress on e-government issues were also discussed, reviewing the indicators and rankings proposed by UNDESA's DPADM and the World Economic Forum.

The results of several research projects on e-government-related issues will be presented on the second day of the event, including the study “Technical and economic factors that facilitate or inhibit the implementation of open data in the region”, led by Mauricio del Solar, from the Universidad Técnica Federico Santa Maria (USM) of Chile.

Some panels will also address various emerging issues for e-government, such as open data and personal data privacy. Elisa Calza, OD4D coordinator for ECLAC, will participate in a panel on open data, which can be followed via live streaming from 11 a.m. (Costa Rica time).


Event “Ciudadanía y Gobierno Digital C+G” in Chile

On November 20, 2013, the conference “Ciudadanía y Gobierno Digital C+G” took place in Santiago, Chile, organized by the Unit for Digital Government Modernization and Digital Government of the Ministry of General Secretariat of the Presidency of Chile. For the second consecutive year, this initiative served as a space for practitioners to debate issues pertaining to digital development in Chile, as well as global trends in the area, in order to identify new patterns of communication and participation between institutions and citizens.

“Ciudadanía y Gobierno Digital C+G” focused mainly on open government and open data as key elements to promote citizen participation and transparency. Two parallel workshops, featuring both national and international experts, were dedicated to “Social networks for citizens” and “Challenges of Open Data in government”. In the Open Data workshop, representatives from Junar presented their work providing technical support for the opening of government data catalogs in the region. Some relevant Chilean open data initiatives were also discussed, such as the linked open data of the Library of the National Congress of Chile and the opening of the data catalog of Peñalolén. The workshop also highlighted the citizen-empowerment potential of experiences in the health field and of open data journalism, as in the case of Poderopedia.

A hackathon entitled “C+G Codeando por Chile” was held the weekend before the meeting. It was the first citizen applications contest based on public data promoted in Chile by the public sector, aimed at encouraging the use of public data to develop applications that serve the community, facilitating access to useful information in innovative ways in key areas such as education, health, transport and security.

Open Data is Hot Topic at the W3C Brazil Conference

The city of São Paulo hosted the 4th Web.br Conference on October 18-20 – an event promoted by the W3C Brazil office to debate the future of the Web – and Open Data was one of the hot topics debated.

According to the manager of W3C Brazil, Vagner Diniz, debating data opening is paramount for the Web's progress. Hence, the topic featured in several parts of the program – panels, lectures, coffee-break chats – as well as during the hackathon:

“There is an ever-increasing number of devices capable of connecting to the Internet. Connecting so many different types of devices to the Internet only makes sense if they can communicate with each other – i.e. if they can exchange information – so that this data sharing enables better use of each connected device. When we talk about an open Web, which was the theme of this Conference, we are talking about a Web that encompasses these connected devices. And by talking about a Web that encompasses these connected devices, we are referring to Open Data. For it is paramount to have data capable of traveling from one place to another seamlessly, or data that enables me, from my device, to access data in a different device, thus enriching my Web experience.”

With a full room at 9 a.m., the talk by Jeanne Holm (Data.gov evangelist and Chief Systems Architect at NASA’s Jet Propulsion Laboratory) presented the U.S. experience of opening its data and the impact on citizens’ lives. In an exclusive interview, Holm said that the U.S. Government’s focus on this topic is how to provide more data, information and services to citizens, so as to enable them to make better decisions daily.

According to her, the government’s Open Data initiative involves 180 agencies, which have already provided access to 400,000 databases.

“What is interesting about this is that when developers come together, as in an event like this Conference today, they get their hands on these data and create applications or websites, or data journalists analyze them and help us understand what those data mean.”


Another highlight of the Conference was an announcement by the Ministry of Justice confirming its first publication of data on the website dados.gov.br. According to Francisco Carvalheira, Coordinator of the Ministry of Justice’s Transparency and Access to Information Program, the institution decided to open its database of consumer complaints received through Procons (Consumer Protection Agencies) across the country. A “substantiated complaint” is an administrative procedure foreseen by the Consumer Protection Code and represents 15% of the complaints registered by Procons.

“We believe that society will be able to come up with potential uses for this database. We believe that by publishing it in an open format we will be contributing to the actual Consumer Protection public policy.”

The announcement was made by the Ministry of Justice during the panel “How to make the most of the Access to Information Act”. During the presentation, Francisco Carvalheira said that the institution has so far received 2,047 requests for access to information.

Also, to ensure the practical side of the debates, Web.br created a space for journalists, programmers and web designers to work together on existing databases to produce information. During the Decoders hackathon, Open Data cases were presented and application prototypes were created using public databases.

Zeno Rocha is a developer who says that he and a friend created a game especially to be presented at Decoders and to motivate participating developers. According to him, it is an application aimed at providing young Facebook users with information on politicians in a fun way.

The developers Kako and Rafael, meanwhile, saw their “Transpolitica” project win the hackathon. This was the first time either of them had worked with Open Data.

Like Hurricane Maps? Thank Open Government Data Nerds | TechCrunch

Article originally published on TechCrunch.com

As Hurricane-battered East Coasters turn to online crisis maps for weather updates and evacuation notices, we should all take a moment to give kudos to the spreadsheet nerds who advocated opening up the very government data reserves that now fuel these online tools.

From Google’s hurricane hub to The New York Times evacuation map, life-saving online tools draw from a recent and relatively underfunded set of government programs that release information in ways conducive to third-party developers. “Open data is critical in crisis situations because it allows government to inform and serve more people than it ever could on its own through conventional channels. By making data freely available in a usable format for civic-minded developers and technology platforms, government can exponentially scale its communications and service delivery,” New York City’s Chief Digital Officer, Rachel Haot, writes to TechCrunch in an email (hopefully from a safe place).

The small but tenacious open data movement is based on a faith that citizens can build amazing, yet unknown tools with the vast reams of data warehoused on government servers. “We are enabling entrepreneurs and innovators across all walks of life to tap into fields of data sitting in the vaults of government in machine-readable form,” said Todd Park, President Obama’s Senior technology advisor. They’ll “create all kinds of services and products that we can only even barely imagine.”

It was President Reagan who originally released Global Positioning System (GPS) data, in response to a downed airliner that had accidentally wandered into Soviet territory; yet he could never have foreseen that GPS would eventually power an entire industry of smartphone and automobile navigation products.

In between national crises, open data advocates are relegated to the bottom of the totem pole of government priorities. After all, in the midst of a crippling recession and ongoing trillion-dollar foreign wars, paying the salaries of programmers to transfer private data onto public spreadsheets is a tempting program to put on the chopping block (and is therefore constantly under threat of defunding). When open data is attached to partisan lightning rods like healthcare, it can evoke the wrath of small-government pundits.

But thanks to their faith in the power of liberated data, East Coasters are a bit safer (and the rest of the world has cool products like Google Maps). So the next time you read a story about a programmer ferociously demanding open data for some seemingly obscure government service, like parking meters, comment at the bottom of the article with a simple “thanks.” You never know how the fruits of his labor will affect you or your loved ones.

 

Web.br Conference debate on Open Data

The 4th Web.br Conference began this Thursday, in São Paulo. The aim of the event is to uncover new paths and steer the debate on the future of the Web, highlighting its most relevant topics and discussing how to universalize it according to the W3C’s principles: a Web for all, from any device, anywhere, in any language or culture.

Brazilian and foreign experts will talk about HTML and CSS3, Web accessibility for disabled people, the Semantic Web and data visualization. During the three-day Conference, the Open Data movement will also be discussed, as well as its impacts on the Access to Information Act.

On Saturday, the Conference will give way to Decoders W3C Brasil, a collaborative hackathon. “It doesn’t matter whether you are a journalist, designer, sysadmin or gardener; all that matters is that you are willing and have a laptop to spend the afternoon hacking on Open Data.”

Hence, OD4D will cover Web.br and bring you all the relevant discussions on Open Data, the cases presented and, of course, whatever is created during Decoders.
Do not miss it!

Follow the event live and our coverage on SoundCloud.

“Most of the data stored by governments is not translated into information or services to the population”

Interview originally published in Blog Públicos – Estado de São Paulo

“Governments are not really aware of the amount and nature of the data they have stored. When they do have a rough idea, they lack the time to consider how that data can be applied and converted into services for the population.”

The general manager of the W3C consortium in Brazil – an international community of 300 private and state enterprises and universities that work together to develop Web standards – Vagner Diniz maintains in his interview with Públicos that governments must allow civil society to decide which public data are of interest to the population. He also believes that both parties must join forces to make the data supply meet the demand for information.

“We cannot just sit around waiting for the government to publish information, wasting money on data that might not even be of interest to the population. We will try to identify which data can be actually useful, create a demand for it and reach an agreement with government bodies to come up with a framework of priorities,” he says.
According to Diniz, civil society can spot possibilities in the data that are overlooked by governments. “Two hundred million people will see much more than 4 or 5 million civil servants.”

Why is it important for governments to publish their data in open formats?
The amount of data gathered and never used by governments ends up as a useless mass of information. Governments use only the portion of the data that they need for administrative purposes; most of it is never translated into information or services for the population. Governments are not really aware of the amount and nature of the data they have stored, and when they do have a rough idea, they lack the time to consider how that data can be applied and converted into services for the population.

How important is this information to civil society?

What’s most important in making this information available is allowing the population itself to say: “This set of data might interest me, it is useful to me. Let me use it because I’ll be able to come up with scenarios in which it is relevant, while you as government have too many other concerns that prevent you from seeing what I can see.” In other words, it’s the idea that two hundred million people will see much more than 4 or 5 million civil servants. With governments worldwide starting to open their data, organizations, communities, interested individuals, Web programmers and volunteers have created interesting application software to make use of the data available.

What about governments?
Curiously, this has generated an exchange of data within governments themselves. Different government bodies now have access to information from other bodies, which was previously very difficult to obtain due to endless bureaucratic processes.

This will undoubtedly contribute to greater government efficiency. But how can we guarantee that the immense supply of data stored by governments will meet society’s demand for information?
That is a tough task which I do not expect to see easily accomplished. Reaching an ideal stage of free-flowing information from government to society will be a hard process. It will involve raising awareness. There is a lot of resistance to publishing public data because the government sees itself much more as a proprietor than a custodian of that data. Public data are public, they belong to the population, and governments are custodians of data, but they act like proprietors. They fear what will be done to “their” data. A second effort involves qualification, as publishing these data in open formats demands a certain degree of technical expertise. We have to study the technologies that allow data to be openly published on the Internet. We must train people to do this.

Now…
…lastly, there must be an open and frank dialogue between the custodians of the data – the government bodies – and those interested in having access to it – civil society organizations and many private citizens. We will try to address priorities. We cannot just sit around waiting for the government to publish information, wasting money on data that might not even be of interest to the population. We will try to identify which data can actually be useful, create a demand for it and reach an agreement with government bodies to come up with a framework of priorities.

You once mentioned that developing application software is much easier than gathering consistent data. Could you explain this?
Developing an application based on available data merely involves creating code that any slightly experienced web developer can read and freely apply to their own application. It is quite simple, much like creating a Web page – thanks to the tools available, you don’t even have to be a Web developer to create a Web page nowadays. Publishing data in an open format is more complicated, given that you, as the custodian of that data, have many other concerns besides the technical aspect of making it available. It’s about more than that…

Yes…
…you have to make sure that the data is consistent. There cannot be another dataset with information that clashes with the data being published. You will publish three, four, ten databases, and any similar information they contain cannot be inconsistent. Secondly, there are security issues you need to worry about. You cannot allow the person who will use the data to alter them in any way. Thirdly, the data being published must be certified. Because if someone happens to misuse these data and alter them in any way, and then claim to have obtained the information from a government website, you, as the publisher, can prove that the original data were altered by that person. So there are many aspects to be considered when making information available.
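The certification point above can be illustrated with a common technique: publishing a cryptographic checksum alongside each dataset, so that any altered copy can be detected and the publisher can prove the original was changed. A minimal sketch in Python (the dataset content here is a hypothetical example, not a real government file):

```python
import hashlib

# The dataset as published by the government body (hypothetical content).
published = b"street,district\nAv. Paulista,Bela Vista\n"

# The publisher computes a checksum and releases it with the data.
checksum = hashlib.sha256(published).hexdigest()

def is_authentic(data: bytes, expected: str) -> bool:
    """Return True only if `data` matches the published checksum."""
    return hashlib.sha256(data).hexdigest() == expected

# Anyone can later verify a copy against the published checksum:
tampered = published.replace(b"Bela Vista", b"Centro")
print(is_authentic(published, checksum))  # True
print(is_authentic(tampered, checksum))   # False
```

Real deployments typically go further (e.g. digital signatures rather than bare hashes), but the principle is the same: the original, as published, is verifiable.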

Can you give an interesting example of data inconsistency?
I had an experience as IT director of a city in the state of São Paulo. A typical case was the city’s streets register. Each city hall department had its own register, with data fields tailored to the needs of each department. The finance department’s register was geared towards collecting property tax, while the register of the public roads department focused on road works. The legal department was more focused on executing outstanding debts, and so forth. I counted six or seven registers. All of them had different information about the same streets. Even worse, the street names also differed among the registers, with different abbreviations. You never knew if a street in one register was the same as in another. It was also impossible to unify these registers, as they had different formats. This poses a serious problem when the information is made available, as different registers show the same information in different ways.

This reveals not only the size of the problem, but also the growing need to standardize government information.
Absolutely. This has been critical since the adoption of information technology in the organization of corporations. The need for standardization goes way back. Professionals in the area joke that the purpose of information technology is not to help you get better organized, but to help you make the same blunders you used to make without it (laughs). When you computerize an environment without altering processes and standardizing information, you will just do the same things you did before, but more quickly.


Can the private sector benefit from open data? If so, how?

I believe so, although the private sector has not yet realized this. It can benefit greatly in many areas of the open data value chain, especially technology businesses. One example is publishing open data on the Web. Moreover, creative and innovative businesses will scrutinize the open data carefully and be able to find ways to reuse and transform these data into commercially valuable services.

Can you give an example?
Nowadays, the IBGE Census is a rich source of information. It contains a lot of data on the country, the citizens, their distribution and characteristics. If these data are made available they can be extremely useful, while ensuring the right to confidentiality of personal data. Based on them you could, for example, offer consultancy services for new businesses based on socioeconomic profiles, or give advice on which businesses are in demand based on household profiles. Another example, already in operation in Brazil, is Gas Finder, a mobile application which allows users to locate nearby gas stations. It is extremely useful and was developed using data available on the website of the National Oil Agency. You don’t necessarily have to generate income by charging the customer directly; income may come from ads displayed alongside the information. All it takes is entrepreneurship and creativity.

The 5 stars of Open Data

When we talk about Open Data strategies that go beyond simply publishing information, we may introduce the concept of Linked Data into the debate – or go even further: Linked Open Data (LOD).

In the words of Tim Berners-Lee, the inventor of the World Wide Web, “Linked Open Data is Linked Data which is released under an open license”. Linked Data does not always have to be open, but Linked Open Data, by definition, does. To promote this type of data, Berners-Lee suggests a 5-star rating system.

This rating system awards one star to initiatives that make information publicly available under an open license. More stars are awarded progressively based on how open and accessible the analyzed data are:

★ Available on the Web (in any format – e.g. PDF), provided it is under an open license, to qualify as Open Data

★★ Available on the Web as machine-readable structured data (e.g. an Excel file with an XLS extension)

★★★ As above, but in a non-proprietary format (e.g. CSV instead of Excel)

★★★★ All of the above, plus the use of W3C open standards (RDF and SPARQL): use URIs to identify things, so that people can point at your data

★★★★★ All of the above, plus: link your data to other people's data to provide context
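To make the ladder concrete, here is a minimal sketch in Python of how the same record might look at three stars (structured, non-proprietary CSV), four stars (URI-identified statements) and five stars (a link into someone else's data). All URIs and values below are illustrative examples, not real endpoints:

```python
import csv
import io

# 3 stars: machine-readable, non-proprietary structured data (CSV).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["city", "country", "population"])
writer.writerow(["Santiago", "Chile", "5614000"])
csv_data = buf.getvalue()

# 4 stars: identify things with URIs so that others can point at them.
# (These URIs are hypothetical examples.)
subject = "http://example.org/city/santiago"
triples = [
    (subject, "http://example.org/prop/country", "Chile"),
    (subject, "http://example.org/prop/population", "5614000"),
]

# 5 stars: link your data to other people's data for context,
# e.g. to the DBpedia resource describing the same city.
triples.append(
    (subject, "http://www.w3.org/2002/07/owl#sameAs",
     "http://dbpedia.org/resource/Santiago")
)

for s, p, o in triples:
    print(s, p, o)
```

The jump from three to four stars is the decisive one: once each thing has a URI, anyone else's dataset can refer to it, which is what makes the fifth star possible.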

 

We have reproduced below a list of the benefits of publishing data according to the 5-star rating system, both for publishers and consumers:

 

Benefits of the 5-star rating

★

For the consumer:
  • you can look at the data
  • you can print it
  • you can store it locally (e.g. on your hard drive or on a memory stick)
  • you can change the data as you wish
  • you can access the data from any system
  • you can share the data with anyone

For the publisher:
  • publishing is simple
  • you don’t need to keep repeating that people are allowed to use the data

★★

For the consumer:
  • all the benefits of one star
  • you can process, aggregate, calculate and visualize the data with proprietary software, and export it to any other structured format

For the publisher:
  • publishing is easy

★★★

For the consumer:
  • all the benefits of two stars
  • you can handle the data as you wish, without being locked into any particular software

For the publisher:
  • publishing is even easier

★★★★

For the consumer:
  • all the benefits of three stars
  • you can bookmark individual data items
  • you can reuse parts of the data
  • you can reuse existing tools and libraries, even if they only partially support the standards used by the publisher
  • you can combine the data with other data

For the publisher:
  • you have fine-grained control over the data items and can optimize access to them
  • other publishers may link to your data, promoting it to 5 stars

★★★★★

For the consumer:
  • you can discover more related data while consuming the data
  • you can learn about the data schema directly

For the publisher:
  • you make your data easier to find
  • you add value to your data
  • your organization enjoys the same benefits from the links as consumers do

Linked Data and Open Data

Linked Data and Open Data. These terms sound similar, but refer to different concepts. In fact, Linked Data complements the Open Data movement; ideally, the two go hand in hand.

Linked Data is the next development from the concept of Open Data, and it requires the latter to exist. While the concept of Open Data refers to publishing information and ensuring universal access to it, the concept of Linked Data refers to connecting these data to other sets of data. Together, these two movements not only make documents available, but also provide related information that explains and describes the content, its meanings and the relationship between the data shown.

An example of Linked Data is DBPedia, which extracts information from Wikipedia and makes it available through free licenses (Creative Commons Attribution-ShareAlike 3.0 License and GNU Free Documentation License), in addition to attaching other datasets found on the Web to Wikipedia data.

Along the same lines there is also GeoNames, a free geographic database, accessible under a Creative Commons license, which makes in excess of 10 million names available.

Hence, it may be said that Open Data and Linked Data walk hand in hand towards the development of the Semantic Web, which represents the “large-scale integration of data available on the Web”. According to Tim Berners-Lee, the creator of the World Wide Web:

“The Semantic Web isn’t just about putting data on the web. It is about making links, so that a person or machine can explore the web of data. With linked data, when you have some of it, you can find other, related, data”.
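The exploration Berners-Lee describes – having some data and following links to find other, related data – can be sketched with a toy in-memory triple store. The URIs and facts below are illustrative only; a real Linked Data application would dereference URIs on the Web:

```python
# A toy triple store: each entry is a (subject, predicate, object) statement.
# All identifiers are hypothetical examples.
triples = [
    ("ex:Santiago", "ex:capitalOf", "ex:Chile"),
    ("ex:Chile", "ex:continent", "ex:SouthAmerica"),
    ("ex:Chile", "ex:officialLanguage", "ex:Spanish"),
]

def related(node, store):
    """Follow outgoing links from `node` to discover related data."""
    return [(p, o) for s, p, o in store if s == node]

# Start from one piece of data...
start = "ex:Santiago"
for predicate, obj in related(start, triples):
    print(start, predicate, obj)
    # ...and follow the links to reach other, related data.
    for p2, o2 in related(obj, triples):
        print("  ->", obj, p2, o2)
```

Starting from the single fact about Santiago, the traversal also surfaces Chile's continent and official language – data the consumer never asked for directly, reached purely by following links.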

Watch Tim Berners-Lee’s TED Talk on the next web:

How to Open?

In order to be regarded as open, public data must be comprehensive, accessible, primary (i.e. as collected, without statistical treatment), current, machine-readable, non-discriminatory (e.g. not requiring registration) and non-proprietary, and their licenses must uphold these principles without limiting freedom of use.

Much publicly available data is not really open. It may have been published in formats that are not machine-readable, or under restrictive licenses; it may be available only as HTML tables, plain-text files or PDFs. Developers must therefore translate these data, cross-reference them and publish them according to the rules and principles set out above.
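As a minimal illustration of that translation step, the sketch below parses data trapped in a plain-text listing (a hypothetical example) and re-serializes it as CSV, a non-proprietary, machine-readable format:

```python
import csv
import io

# Data as often published: a whitespace-separated plain-text listing
# (hypothetical example; real files are usually messier).
raw = """\
station fuel price
Central Gasoline 5.89
Norte Ethanol 3.99
"""

# Parse the columns, then re-serialize as structured CSV.
rows = [line.split() for line in raw.strip().splitlines()]
buf = io.StringIO()
csv.writer(buf).writerows(rows)
print(buf.getvalue())
```

Real-world translation jobs add the hard parts – inconsistent column widths, merged cells, scanned PDFs – but the goal is the same: structured rows and columns that software can consume directly.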

Institutions that wish to open their data must prepare a plan of activities. This task ranges from determining which data will be published, to how the data will be published and visualized, to strategies for promoting the use of such data by communities and activists.

The international movement for opening government data is based on the three laws proposed by David Eaves:

  • If data can’t be spidered or indexed, it doesn’t exist.
  • If it isn’t available in open and machine readable format, it can’t engage.
  • If a legal framework doesn’t allow it to be repurposed, it doesn’t empower.

 

In other words, the first step towards opening data is identifying the information held by governments, companies, etc. The information must then be converted into a machine-readable format and, finally, made accessible to all.

Below we have listed a series of documents which may serve as guidelines for governments, developers and others interested in data-opening processes. Check them out: