Open Referral is platform agnostic. We are not invested in Human Services Data Specification (HSDS) compliant implementations running on any single platform or vendor. Our mission is that any human services implementation can run anywhere, and these are the platforms we are currently targeting with implementations and projects.
The original Ohana implementation of Open Referral runs on Heroku, and we are looking to develop additional implementations that will deploy and operate via the Heroku cloud platform, and even potentially connect with Salesforce along the way.
The Human Services API prototype in PHP currently runs on Amazon Web Services (AWS), leveraging EC2, S3, and RDS for the backend. We are looking to develop additional solutions that run on AWS and leverage a variety of AWS services like Lambda, API Gateway, and more.
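To make the Lambda and API Gateway idea concrete, here is a minimal sketch of what a serverless HSDA read endpoint could look like. The `SERVICES` data and field names are hypothetical placeholders for illustration, not a real HSDS data set or the official API shape.

```python
import json

# Hypothetical placeholder data; a real implementation would query
# a data store such as RDS or DynamoDB for HSDS service records.
SERVICES = [
    {"id": "1", "name": "Food Pantry", "status": "active"},
    {"id": "2", "name": "Job Training", "status": "active"},
]

def handler(event, context):
    """AWS Lambda handler returning services as an API Gateway proxy response."""
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(SERVICES),
    }
```

Deployed behind API Gateway, a `GET` to the route would invoke this handler and return the JSON list of services.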
We have begun to explore what it would take to design, deploy, and manage human services API implementations on the Azure platform. We'd like to have a working prototype, as well as specific implementations on Azure that we can showcase to help support future efforts on that cloud provider.
We have also begun to explore what it would take to design, deploy, and manage human services API implementations on the Google platform. We are looking for investment, developers, and potential implementations that would benefit from running on the Google platform. There is a wealth of services available on the platform that the community can put to use when providing access to data and other resources.
If there is another platform you'd like to see a Human Services Data Specification (HSDS) implementation target, please let us know. We are considering investing in platforms like Salesforce, WordPress, and others, but we don't like to start new projects without some support from a real world implementation and partners.
In addition to the platform projects and the PHP demo, we are regularly investing in new open source tools, as well as helping seed and encourage new projects developed on top of the Human Services Data API (HSDA), helping service providers better serve their constituents.
Ohana is the original implementation of the Human Services Data Specification (HSDS), which began as a Code for America project. It is still an active open source project with a number of forks and implementations. Ohana is a Ruby implementation that runs on Heroku, but it can be deployed anywhere.
An open source tool developed by Chris Spilio for validating the Open Referral schema. It provides a dynamic registry of resource types based on the latest schema, a visualization of each resource schema (fields, keys, types, formats, etc.), downloadable sample CSV templates for resources, and upload and validation of a sample CSV file. A great example of the Open Referral community in action.
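The CSV upload-and-validate step can be sketched in a few lines. This is not the tool's actual code; the column list below is an illustrative subset of HSDS organization fields, not the authoritative schema.

```python
import csv
import io

# Illustrative subset of HSDS organization columns, not the full schema.
REQUIRED_COLUMNS = {"id", "name", "description", "url"}

def validate_csv_headers(csv_text):
    """Return the set of required columns missing from the CSV header row."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader, [])
    return REQUIRED_COLUMNS - {h.strip() for h in header}
```

A file missing `description` and `url` columns, for example, would come back with exactly those two names flagged, letting the tool report what needs fixing before import.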
The Link-SF project was forked by Optimizely and made HSDS compliant, decoupling the front end from the backend and leveraging GitHub. I took it a step further and made the front end run 100% on GitHub, making the project forkable and usable in any city.
Don't let the existing projects in motion limit your imagination. The municipalities that put the HSDA specification to work can benefit from a wide range of tooling that supports the format. While we encourage the development of platform, server, and other applications, here are some other ideas we'd like to see out there, to stimulate the community's imagination and hopefully get built.
We are currently using Read the Docs, as well as OpenAPI and Swagger UI, for documentation of the schema and API. We welcome new implementations that leverage the good looks of Read the Docs while also combining the machine readability and interactive nature of OpenAPI and Swagger UI.
Open Referral is in need of sample data sets that can be easily uploaded or imported into test, demo, and playground editions of the HSDA specification. This helps create platforms that can be played with and explored, teaching people what is possible with data that is as real as possible.
With the recent hurricanes in Florida and Texas, there is a significant amount of momentum when it comes to developing websites, web applications, and mobile applications that can assist people in a time of need. Take a look at some of what has been developed already, and spend some time thinking about what might help folks in the future.
Ensuring that organizations, municipalities, and vendors are all speaking the same language is pretty critical to all of this working. Schema validation in a variety of languages and platforms would help folks make sure their data is compliant. The OpenAPI for HSDA has a JSON schema portion of the specification, which can be used to validate API requests and responses.
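In practice you would feed the HSDA JSON schema to an off-the-shelf validator library, but the core idea can be shown with a minimal stdlib-only sketch. The schema fragment below is illustrative, not the official HSDA schema, and the field names are assumptions for the example.

```python
# Illustrative fragment in the spirit of a JSON Schema, not the
# official HSDA schema: required fields plus expected types.
ORGANIZATION_SCHEMA = {
    "required": ["id", "name"],
    "types": {"id": str, "name": str, "description": str},
}

def validate_record(record, schema):
    """Return a list of human-readable validation errors (empty if valid)."""
    errors = []
    for field in schema["required"]:
        if field not in record:
            errors.append(f"missing required field: {field}")
    for field, expected in schema["types"].items():
        if field in record and not isinstance(record[field], expected):
            errors.append(f"field {field} should be {expected.__name__}")
    return errors
```

Running every inbound API request body through a check like this, against the real HSDA schema, is what keeps implementations speaking the same language.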
If there is an idea you'd like to see developed, feel free to reach out and I am happy to add to the list. We are looking to learn from existing human services implementations and understand what is needed on the ground.
HSDA is an API specification for managing a core set of resources (organizations, locations, and services), but there are a handful of other projects that add to the work going on around HSDA. These are the additional specifications in progress, hoping to provide a buffet of services that any government or organization can deploy.
A project to develop search capabilities on top of HSDA implementations providing advanced capabilities for searching across all resources, for a wide variety of applications.
A project dedicated to supporting bulk-level operations around human services data, handling large-scale adding, updating, and migrating of data, including the use of jobs to help reduce the load on core HSDA systems.
A separate project for managing taxonomy-related HSDA data, including AIRS and Open Eligibility, while also allowing for custom taxonomies. It supports the browsing and searching of HSDA data using any supported taxonomy.
A project for supporting the orchestration of HSDA data, including support for events, webhooks, and other features that allow HSDA implementations to push and pull data based on a schedule or on events.
A project for handling the underlying logging and tracking of each API call made to any HSDA implementation. The HSDA Meta service can work with other systems to keep track of what happens on the platform, and can be leveraged as a transaction system for triggering, rolling forward, and rolling back anything that occurs.
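The transaction-log idea behind HSDA Meta can be sketched simply: record every API call so it can be audited, queried, or replayed later. The field names here are hypothetical illustrations, not part of any HSDA Meta specification.

```python
import datetime

class TransactionLog:
    """Sketch of a per-call transaction log for an HSDA implementation."""

    def __init__(self):
        self.entries = []

    def record(self, method, path, status):
        """Append one API call to the log with a UTC timestamp."""
        self.entries.append({
            "method": method,
            "path": path,
            "status": status,
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })

    def calls_to(self, path):
        """Return all logged calls made against a given path."""
        return [e for e in self.entries if e["path"] == path]
```

A roll-forward or roll-back mechanism would then walk `entries` in order (or reverse order) and reapply or undo each recorded change.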
A project for handling the management of and access to HSDA implementations, providing a basic API management layer that handles access and permissions for working with any HSDA API.
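At its simplest, that management layer is a lookup from an API key to the operations it grants. The keys and permission sets below are made up for illustration; a real deployment would use a proper API management solution.

```python
# Hypothetical key registry mapping API keys to allowed HTTP methods.
API_KEYS = {
    "read-only-key": {"GET"},
    "admin-key": {"GET", "POST", "PUT", "DELETE"},
}

def is_allowed(api_key, method):
    """Return True if the key exists and grants the given HTTP method."""
    return method in API_KEYS.get(api_key, set())
```

An HSDA gateway would run a check like this on every request before routing it through to the underlying API.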
A project for handling all utility APIs involved with working with HSDA services and data. Currently providing service and path management, as well as validation of schema and APIs.
While these are potential HSDA projects, they are moving forward as separate services designed to complement and augment a core set of HSDA implementations. They are versioned and managed independently, allowing them to be deployed and scaled independently of core HSDA implementations.
One area I study as the API Evangelist is the monetization and access plans of leading API providers--studying how they generate revenue around API consumption and even partnerships. I'm looking to extend this work to the human services industry, helping providers generate much-needed revenue from their hard work while still serving their constituents. Here is an aggregation of all my stories regarding monetization of public data, and how this applies to my Open Referral work.
Some of my original thoughts around the government generating revenue using public data, breaking from the current thinking that all public data should just be freely available for anyone to download and use in a commercial capacity.
Continuing my work to help the federal government think about generating revenue using public data and APIs. This time it was specifically regarding the Recreational Information Database (RIDB), and how the Department of the Interior and the Forest Service can rethink digital resources alongside how they generate revenue from other physical government resources.
This is an essay I wrote doing a deep dive on the subject. I'm currently reworking it as a blueprint for how I am approaching monetization around human services data. It will become the basis for some future discussions I am facilitating around helping 211 and human service providers generate sensible and fair revenue around their operations--helping them stay available and offer their constituents a valuable service.
Continuing my thoughts on generating revenue from public data, I am also exploring how the future of tax revenue at the municipal, state, or federal level could be collected using APIs and modern approaches to API management. These are all very new thoughts, but I'm looking to keep the conversation going, preparing for the future of digital government.
We are just getting started with the discussions around how human service providers can generate revenue using the public data they manage, while also acknowledging that other commercial vendors generate sensible and fair revenue by providing products, services, and tooling to providers. If you have experience and thoughts to share, feel free to get involved--we'd love to hear from you.