Recap of Previous Posts Part 1 - Overview of the Public Procurement TED dataset Part 2 - Identification of Entities involved in procurement Part 3 - Attribution of GHG Emissions using the CPV classification In the earlier parts of this series we motivated and defined the scope of our exploration of Public Procurement data, dug deeper into constructing economic representations of the public procurement process, and linked procurement entities to private sector sellers.
Recap of Previous Posts Part 1 - Overview of the Public Procurement TED dataset Part 2 - Identification of Entities involved in procurement In the first part of this series we motivated and defined the scope of a study that explores Public Procurement data. In the second instalment we dug deeper into an important facet of the data, with the aim of constructing a meaningful economic representation of the public procurement process.
Recap of Previous Post Part 1 - Overview In the first part of this series we motivated and defined the scope of a study that explores Public Procurement data. We discussed the meaning of the main relevant terms (Open Data, Open Source, Green Public Procurement) and briefly reviewed the current state and challenges of the latter in the EU context. Further, we took a first look at the EU’s TED Database (which is the main source of data) and highlighted some key statistics that bring to light information such as the size of the dataset, its overall structure and some data quality aspects.
Introduction In a series of posts we will explore the role of Open Data and Open Source in enabling and accelerating the broad-based effort towards Green Public Procurement (GPP). There are several important (and possibly obscure or “buzzwordy”) terms in the above sentence, so the first order of business will be to unpack them. Let us start with the term Public Procurement, which will be the main domain of interest in this study.
What is the EU Datathon? The EU Datathon is an annual Open Data competition organised by the Publications Office of the European Union since 2017. The competitions are organised to create new value for citizens through innovation and by promoting the use of open data, in particular the datasets available on the official portal for European data. Every year, the EU Datathon calls for innovators from around the world to come up with new ways of using open data to address important societal and environmental challenges, with the condition that they use at least one of the thousands of data sets published on data.
Equinox is an open source platform that supports holistic risk management and reporting in the context of Sustainable Portfolio Management. The platform integrates geospatial information with applicable regulatory and industry standards, for example the GHG Protocol (accounting for project-based, corporate and city-wide greenhouse gas emissions), the IPCC Emissions Factor database and further reference data, and the PCAF attribution methodologies (among others), to provide a holistic view of the footprint of both individual projects and portfolios.
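For a flavor of what a PCAF-style attribution involves, the following minimal sketch computes financed emissions for a single exposure. The function names and figures are illustrative assumptions, not taken from the Equinox codebase or the PCAF standard text:

```python
# Minimal sketch of a PCAF-style attribution calculation.
# All names and numbers are illustrative, not from the Equinox codebase.

def attribution_factor(outstanding_amount: float, total_equity_plus_debt: float) -> float:
    """Share of the borrower's footprint attributed to the lender."""
    return outstanding_amount / total_equity_plus_debt

def financed_emissions(outstanding_amount: float,
                       total_equity_plus_debt: float,
                       project_emissions_tco2e: float) -> float:
    """Emissions (tCO2e) attributed to a single exposure."""
    return attribution_factor(outstanding_amount, total_equity_plus_debt) * project_emissions_tco2e

# A lender financing 20m of a 100m project emitting 50,000 tCO2e/year
print(financed_emissions(20e6, 100e6, 50_000))  # approx. 10,000 tCO2e attributed
```

The same attribution factor can then be aggregated across all exposures to obtain a portfolio-level footprint.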
Motivation Fig 1. An economic network as a graph. The economy is a complex tangle of various agents that interact via transactions (sales and purchases) and contracts (lending, investing). In recent times, more and more techniques from graph theory and network science are being brought to bear on economic analysis. On the other hand, ever since the seminal contributions of Leontief, Input-Output Models (IO) have been widely used to describe economic relationships between economic actors (e.
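The Leontief setup can be illustrated with a toy two-sector model: total output x solves x = Ax + d for a technical coefficients matrix A and final demand vector d. All numbers below are illustrative:

```python
# Toy two-sector Leontief input-output model (illustrative numbers).
# A[i][j] = output of sector i needed per unit of output of sector j.
A = [[0.2, 0.3],
     [0.4, 0.1]]
d = [100.0, 50.0]  # final demand per sector

# Total output x solves x = A x + d, i.e. (I - A) x = d.
# For the 2x2 case we can solve with Cramer's rule by hand.
m = [[1 - A[0][0], -A[0][1]],
     [-A[1][0], 1 - A[1][1]]]
det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
x = [(m[1][1] * d[0] - m[0][1] * d[1]) / det,
     (m[0][0] * d[1] - m[1][0] * d[0]) / det]
print(x)  # total output each sector must produce, gross of intermediate use
```

The off-diagonal entries of A are exactly the weighted edges of the economic network in Fig 1, which is what makes the graph and IO views two sides of the same coin.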
Equinox is an open source platform that supports the holistic risk management and reporting of major sustainable finance projects (the financing of projects with material physical footprint) such as project finance. Equinox aims to integrate a number of reference databases that facilitate the tasks of sustainable portfolio management. The current focus of such reference material is on emissions factors for various processes and activities. In the latest (Solstice Day!) update of the Equinox Project we discuss the integration of reference data, and in particular greenhouse gas emissions factors as catalogued in the IPCC Emissions Factors Database (EFDB).
The frontpage graphic is adapted from Steffen et al. “Planetary Boundaries: Guiding human development on a changing planet". Science (2015). The Planetary Boundaries concept was proposed in 2009 by this group of Earth system and environmental scientists. The group suggested that finding a “safe operating space for humanity” is a precondition for sustainable development. The framework is based on scientific evidence that human actions since the Industrial Revolution have become the main driver of global environmental change.
Summary: The Open Risk Academy course NPL270672 is a CrashCourse introducing the EBA NPL Templates. Content: We start with the motivation for the templates and the domain of credit data (to which NPL data belongs). We discuss three core classes that capture the essence of lending operations from a lender's point of view (Counterparty, Loan, Collateral). Next we explore classes that capture events in the lending relationship lifecycle (which we term NPL Scenarios).
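A heavily simplified sketch of the three core classes might look as follows; the field names are illustrative placeholders, not the actual EBA template fields:

```python
from dataclasses import dataclass

# Hypothetical, heavily simplified sketch of the three core lending classes.
# Field names are illustrative, not the actual EBA NPL template fields.

@dataclass
class Counterparty:
    counterparty_id: str
    name: str

@dataclass
class Loan:
    loan_id: str
    counterparty_id: str   # links the loan to its borrower
    outstanding_balance: float

@dataclass
class Collateral:
    collateral_id: str
    loan_id: str           # links the collateral to the loan it secures
    valuation: float

loan = Loan("L1", "C1", 250_000.0)
print(loan.outstanding_balance)
```

The point of the sketch is the relational structure: collateral references a loan, and a loan references a counterparty, mirroring how the templates link their tables.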
Equinox is an open source platform that supports holistic risk management and reporting of Sustainable Finance (Sustainable Portfolio Management). The platform integrates geospatial information with applicable regulatory and industry standards from EBA, PCAF and Equator Principles to provide a holistic view of the footprint of both individual projects and portfolios, in particular of project finance investments. Motivation Sustainability (understood in environmental, economic and social terms) is emerging as an undisputed constraint that will shape future human activity and more specifically how the financial system facilitates and empowers economic life.
Celebrating Pi Day 2021 Pi Day is celebrated every year on March 14th. The reason of course is that the day is denoted in some calendars as (3/14), which evokes 3.14, the first three digits of “π”. A thin excuse maybe, but sufficient for the true believers to join along! The occasion represents an annual opportunity for mathematics and science enthusiasts to recite the infinite charms of Pi, including its irrationality, to talk to friends and family about math and its uses, and, when everything else fails, simply eat pie.
What is the future of stress testing? To speculate on the future of Stress Testing we first need a basic definition of what stress testing is. Broadly speaking, the goal of Stress Testing is to assess how a system would behave under adverse conditions that - while not the most likely outcome given the knowledge of today - are within the realm of the plausible. There are two main types of stress testing: real stress testing and hypothetical stress testing.
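As a minimal illustration of the hypothetical variant, one can apply an adverse but plausible shock scenario to a stylized portfolio and measure the resulting loss. All figures below are illustrative:

```python
# Minimal sketch of a hypothetical stress test: apply an adverse scenario
# (here a uniform market value shock per asset class) and compare to base.
# Portfolio values and shocks are illustrative.

portfolio = {"bonds": 1_000_000.0, "equities": 500_000.0}
scenario = {"bonds": -0.05, "equities": -0.30}  # adverse but plausible shocks

base_value = sum(portfolio.values())
stressed_value = sum(v * (1 + scenario[k]) for k, v in portfolio.items())
loss = base_value - stressed_value
print(loss)  # approx. 200,000 loss under the adverse scenario
```

A real (historical) stress test would follow the same mechanics, but with the shocks taken from an actually observed episode rather than a constructed scenario.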
The role of simulation in risk management and decision support A Simulation is a simplified imitation of a process or system that represents with some fidelity its operation over time. In the context of risk management and decision support simulation can be a very powerful tool, as it allows us to assess potential outcomes in a systematic way and explore what-if questions in ways that might otherwise not be feasible. Simulation is used when the underlying model is too complex to yield explicit analytic models (an analytic model is one that can be “solved” exactly or with standard numerical methods, for example resulting in a formula).
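A minimal sketch of such a simulation, estimating losses on a pool of loans by Monte Carlo; all parameters are illustrative:

```python
import random

# Minimal Monte Carlo sketch: the loss distribution of a loan pool once
# defaults and recoveries interact has no simple closed form, so we simulate.
# All parameters are illustrative.

def simulate_pool_loss(n_loans=100, pd=0.02, exposure=1.0, recovery=0.4, rng=None):
    """One simulated draw of total pool loss."""
    rng = rng or random.Random()
    loss = 0.0
    for _ in range(n_loans):
        if rng.random() < pd:          # this loan defaults in this scenario
            loss += exposure * (1 - recovery)
    return loss

rng = random.Random(42)                # fixed seed for reproducibility
losses = [simulate_pool_loss(rng=rng) for _ in range(10_000)]
mean_loss = sum(losses) / len(losses)
print(mean_loss)  # close to the expected loss 100 * 0.02 * 0.6 = 1.2
```

Beyond the mean, the simulated `losses` list gives the whole distribution, which is exactly the what-if information analytic shortcuts tend to miss.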
openNPL 0.2 release The open source openNPL platform supports the management of standardized credit portfolio data for non-performing loans. In this respect it implements the detailed European Banking Authority NPL loan templates. It aims to be easy to integrate into human workflows (using a familiar web interface) and, at the same time, into automated (computer-driven) workflows. The latest (0.2) release exposes a REST API that offers machine-oriented access using what is by now the most established mechanism for achieving flexible online data transfers.
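For a flavor of machine-oriented access, the following sketch constructs (but does not send) an authenticated request. The endpoint path and token header are hypothetical placeholders; consult the openNPL API documentation for the actual routes:

```python
from urllib.request import Request

# Sketch of how a machine client might address the openNPL REST API.
# The host, resource path and auth scheme below are hypothetical placeholders.
base_url = "http://localhost:8080"            # assumed local openNPL instance
endpoint = f"{base_url}/api/counterparties/"  # hypothetical resource path

req = Request(endpoint, headers={"Authorization": "Token <your-token>",
                                 "Accept": "application/json"})
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen`) would then return the portfolio records as JSON, ready for automated processing.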
openNPL now Available in Dockerized Form Following up on the first release of openNPL, the platform is now available to install using Docker. Running openNPL via Docker is the installation option that simplifies the manual process (but a working Docker installation is required!). Docker Hub You can pull the latest openNPL image from Docker Hub (this method is recommended if you do not want to mess with the source distribution).
Non-Performing Loans: The covid-19 crisis will certainly impact the concentration of Non-Performing Loans, but given the special nature of this economic crisis compared with (in particular) the 2008 financial crisis, it is unclear precisely how things will evolve. In a previous post and white paper (OpenRiskWP07_022616) we discussed the importance of advancing open and transparent methodologies for managing the risks associated with such credit portfolios. Effective management of NPL is also a top regulatory priority.
Limit frameworks are fundamental tools for risk management: A Limit Framework is a set of policies used by financial institutions (or other firms that actively assume quantifiable risks) to govern in a quantitative manner the maximum risk exposure permitted for an individual, trading desk, business line etc. Why do we need limit frameworks? A limit framework expresses in concrete terms the Risk Appetite of an institution to assume certain risks.
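In code, the core of a limit framework check is a comparison of current exposure against the permitted maximum. The following sketch uses illustrative desks, limits and exposures:

```python
# Minimal sketch of a limit framework monitoring check.
# Desk names, limits and exposures are illustrative.

limits = {"rates_desk": 10_000_000.0, "credit_desk": 5_000_000.0}
exposures = {"rates_desk": 8_500_000.0, "credit_desk": 5_200_000.0}

report = {}
for desk, limit in limits.items():
    utilization = exposures[desk] / limit          # fraction of limit used
    status = "BREACH" if utilization > 1.0 else "OK"
    report[desk] = (utilization, status)
    print(f"{desk}: {utilization:.0%} {status}")
```

A production framework layers on top of this: escalation thresholds below 100%, aggregation of desk limits into business-line limits, and an audit trail of breaches.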
Visualization of large scale economic data sets: Economic data are increasingly being aggregated and disseminated by Statistics Agencies and Central Banks using modern APIs (application programming interfaces), which enable unprecedented accessibility for wider audiences. In turn, the availability of relevant information enables more informed decision making by a variety of actors in both the public and private sectors. An excellent example of such a modern facility is the European Central Bank’s Statistical Data Warehouse (SDW), an online economic data repository that provides features to access, find, compare, download and share the ECB’s published statistical information.
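As a sketch of the kind of API access described above, the following constructs a query URL for the SDW REST service. The dataflow and series key shown (the daily USD/EUR reference rate) follow published SDW conventions, but verify them against the SDW documentation before use:

```python
# Sketch of constructing a query against the ECB SDW REST API.
# Dataflow and series key are believed correct but should be verified
# against the SDW web service documentation.
base = "https://sdw-wsrest.ecb.europa.eu/service/data"
flow = "EXR"                  # exchange rates dataflow
key = "D.USD.EUR.SP00.A"      # daily USD per EUR reference rate
url = f"{base}/{flow}/{key}?format=csvdata&startPeriod=2020-01"
print(url)
```

Fetching that URL returns the series as CSV, which can then feed directly into whatever visualization toolkit is at hand.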
The challenge with historical credit data: Historical credit data are vital for a host of credit portfolio management activities, from assessing the performance of different types of credits all the way to constructing sophisticated credit risk models. Such is the importance of data inputs that, for risk models impacting significant decision making or external reporting, there are even prescribed minimum requirements for the type and quality of the necessary historical credit data.