
Montreal Python Conference (PyCon) Sprint: What We Worked On

We (Sarah, Ana, Tim and Michael) met at PyCon in Montreal (April 14-17) to lead the Open Contracting sprint in the Open Spaces rooms. Over the four days of sprints, we held many interesting discussions on topics ranging from data ontologies to public procurement phases.

This blog post covers how we applied what we discussed in the previous blog post.

PyCon Sprint Teams

We divided the project work planned for the four-day sprint among three main teams:

  1. Python and machine learning team (Developer Documents)
  2. User case studies team – demand side research (User Stories Tool)
  3. Data field landscape team – supply side research (Hackpad Notes)

Sarah led the coding/Python team. Many developers were interested in contributing their Python skills to Contracting Data Comparison (CDC). The two focus areas were machine learning and CDC functionality. Contributors included Aaron Cohen, Andreas Dewes, Simon Frid, Michael Glaude, Bryan Gorges, Brittain Hard, Brantley Harris, Matt Lamberti, Dan Langer, Teddy McWilliams, Aaron Rothenberg, David Street and Nick Zaccardi.

Tim led the user case studies team, building the User Stories Tool for participants to share their open contracting stories, available on the Open Contracting Data Standard page. Discussions were held with John Jordan and Louis Charbonneau, as well as with other developers at PyCon.

Ana led the data field landscape team. We found many researchers and open government enthusiasts interested in data ontologies, standards and analysis of open contracting. The team: Eric Canen, John Jordan, Herb Lainchbury and Noé Domínguez Porras.

The Data Field Landscape Team

As we are in the initial phases of the open contracting data standard, the sprint focused on Goal 1 – Building a Datamap: provide a landscape of available data. This goal required a deep analysis of the datasets and the fields within them. The question we focused on was the following:

Which fields should be in a standard is a question for our demand-side analysis (asking users what they want and need). But when we ask that question, it is important to be informed by what is already readily available: what data is currently available from contracting portals?

When answering this question, we should keep in mind what users actually need. Knowing the landscape of available data can help inform decision making for a first draft of the standard. This phase is the data audit.

The first step is to categorize datasets into concepts that make sense from both a data and a procurement point of view. In doing so, we began consolidating the datasets that we would subject to a deep analysis. We worked with 7 contracting datasets drawn from our Priority Countries list: Canada, the UK, Mexico, Georgia, Moldova, the EU and UNOPS.

The datasets are available to view in the CDC Open Contracting Data Field Landscape.

We created column headings to show the progression from the original dataset field names to the standard names used in our refinement process. This was especially important when handling non-English datasets and portals.

Throughout our analysis, we realized some datasets contained aggregated information rather than individual contracts, and we added new datasets to the deep analysis. We also decided to focus on three of the open contracting process phases: tender, award and contract; the next section, datamap concepts, details our process. Finally, we constructed a field naming convention to be used across all datasets (e.g., lower case, separated by underscores), as sketched below.
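As a minimal sketch of that convention (the helper name and the exact normalization rules are our own illustration, not part of the standard), a raw field name could be normalized in Python like so:

```python
import re

def to_standard_field_name(raw_name: str) -> str:
    """Normalize a raw field name to lower case, separated by underscores."""
    # Split camelCase boundaries, then replace spaces, hyphens and slashes
    name = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", raw_name)
    name = re.sub(r"[\s\-/]+", "_", name)
    # Drop anything that is not alphanumeric or an underscore
    name = re.sub(r"[^0-9A-Za-z_]", "", name)
    # Collapse repeated underscores and lower-case the result
    return re.sub(r"_+", "_", name).strip("_").lower()

print(to_standard_field_name("Tender Status"))  # tender_status
print(to_standard_field_name("awardDate"))      # award_date
```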

Datamap concepts: Contracting phases

After many discussions about concepts and all phases of public procurement, we decided to focus on 3 of the 6 phases of contracting we agreed upon: Tender, Award and Contract. Our in-depth analysis of the 7 datasets aided that decision: most data found in the prioritized datasets related to these particular phases.

All phases are listed below; field names are linked to each phase.

1. Planning

2. Tender (sprint focus)

3. Award (sprint focus)

4. Contract (sprint focus)

5. Performance

6. Close

Each dataset included the following columns: Original Name, English Name, Formatted Name, Standard Name, Description of Field, English Translation of Description. These were created to provide clear links from the original dataset names to the names finalized for CDC.
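To make the columns concrete, here is one hypothetical row traced through them (the values are our own illustration, not taken from the real datasets):

```python
# One hypothetical field from a non-English dataset, traced through the
# columns above; all values are illustrative assumptions
field_row = {
    "original_name": "Fecha de adjudicación",  # as published in the source portal
    "english_name": "Award date",              # literal English translation
    "formatted_name": "award_date",            # English name with the naming convention applied
    "standard_name": "award_date",             # name finalized for CDC
    "description": "Fecha en que se adjudicó el contrato",
    "english_description": "Date on which the contract was awarded",
}
```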

We decided to categorize the dataset information in linked concepts: Phase – Entity – Group. All field names we found in the datasets are integrated into these three linked concepts.

e.g., tender (phase) – solicitation (entity) – tender_status (group)
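In code, this hierarchy could be represented as a nested mapping (a sketch only; the nesting and names below are our own illustration, not the final CDC schema):

```python
# Phase -> Entity -> Group, following the tender example above
datamap = {
    "tender": {                  # phase
        "solicitation": {        # entity
            "tender_status": [], # group; field names from the datasets collect here
        },
    },
    "award": {},                 # populated the same way
    "contract": {},
}
```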

We also added information about the field names in terms of ‘Allowable Values’ and ‘Data Types’ (shown in detail below). Allowable Values constrain free-text input to an exclusive set of parameters. For example, the field “tender_openness” can only be Open, Selective or Limited; these terms are its allowable values.

Data Types represent the format of the data in each field.
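A minimal sketch of how both constraints could be checked (the field specs and the validate helper are illustrative assumptions, not the CDC implementation):

```python
from datetime import date

# Illustrative field specs: a data type plus an optional set of allowable values
FIELD_SPECS = {
    "tender_openness": {"type": str, "allowable": {"Open", "Selective", "Limited"}},
    "award_date": {"type": date, "allowable": None},  # typed, but free-valued
}

def validate(field_name: str, value) -> bool:
    """Check a value against the field's data type and allowable values."""
    spec = FIELD_SPECS[field_name]
    if not isinstance(value, spec["type"]):
        return False
    return spec["allowable"] is None or value in spec["allowable"]

print(validate("tender_openness", "Open"))        # True
print(validate("tender_openness", "Public"))      # False: not an allowable value
print(validate("award_date", date(2014, 4, 17)))  # True
```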

In addition to the in-depth analysis of these datasets, we brainstormed visualizations for comparing datasets across countries. One potential target capability would be comparative visualization across jurisdictions, similar to what is rendered in Georgia’s procurement portal.

Epilogue

The discussions and analysis of the contracting datasets proved to be a valuable learning experience. After the sprints, we learned that we needed to consolidate part of this work: there was too much detail in the phases we focused on (tender, award and contract) and not enough clarity. Still, going through this process was necessary to achieve clarity on the contracting phases and the type of information the standard needs to include. We aim to strike a balance between simplicity and sufficient detail.

The next blog post will discuss the changes and progress made since the PyCon Sprint.
