
API Discovery News

These are the news items I've curated in my monitoring of the API space that have some relevance to the API definition conversation and that I wanted to include in my research. I'm using all of these links to better understand how the space is testing their APIs, going beyond just monitoring and understanding the details of each request and response.

Patent US9639404: API Matchmaking Using Feature Models

Here is another patent in my series of API related patents. I’d file this in the same category as the other similar one from IBM–Patent US 8954988: Automated Assessment of Terms of Service in an API Marketplace. It is a good idea. I just don’t feel it is a good patent idea.

Title: API matchmaking using feature models Number: 09454409 Owner: International Business Machines Corporation Abstract: Software that uses machine logic based algorithms to help determine and/or prioritize an application programming interface’s (API) desirability to a user based on how closely the API’s terms of service (ToS) meet the users’ ToS preferences. The software performs the following steps: (i) receiving a set of API ToS feature information that includes identifying information for at least one API and respectively associated ToS features for each identified API; (ii) receiving ToS preference information that relates to ToS related preferences for a user; and (iii) evaluating a strength of a match between each respective API identified in the API ToS feature information set and the ToS preference information to yield a match value for each API identified in the API ToS feature information set. The ToS features include at least a first ToS field. At least one API includes multiple, alternative values in its first ToS field.

Honestly, I don’t have a problem with a company turning something like this into a feature, and even charging for it. I just wish IBM would help us solve the problem of making terms of service machine readable, so something like this is even possible. Could you imagine what would be possible if everybody’s terms of service were machine readable, and could be programmatically evaluated? We’d all be better off, and matchmaking services like this would become a viable service.
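To make that concrete, here is a minimal sketch of how a matchmaking service could score machine readable terms of service against a user's preferences, in the spirit of the three steps the abstract describes. The scoring logic, field names, and catalog are my own hypothetical illustration, not IBM's patented method:

```python
def match_value(api_tos, preferences):
    """Score how closely an API's ToS features meet a user's ToS preferences.

    api_tos: dict mapping a ToS field to the list of alternative values the
             API offers (the patent allows multiple alternatives per field).
    preferences: dict mapping a ToS field to the value the user prefers.
    Returns the fraction of preferred fields satisfied by at least one
    of the API's alternative values.
    """
    if not preferences:
        return 0.0
    satisfied = sum(
        1 for field, wanted in preferences.items()
        if wanted in api_tos.get(field, [])
    )
    return satisfied / len(preferences)


def rank_apis(catalog, preferences):
    """Prioritize a catalog of {api_name: tos_features} by match value."""
    return sorted(catalog,
                  key=lambda name: match_value(catalog[name], preferences),
                  reverse=True)


# Hypothetical ToS features for two APIs, with one multi-value field.
catalog = {
    "api-a": {"rate-limit": ["1000/day", "unlimited"], "attribution": ["required"]},
    "api-b": {"rate-limit": ["100/day"], "attribution": ["none"]},
}
preferences = {"rate-limit": "unlimited", "attribution": "required"}
```

Everything here hinges on the terms of service being available in a machine readable form to begin with, which is exactly the missing piece I'd like to see solved.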

I just wish more of the energy I see go into these patents would be spent actually doing things in the API space: providing low cost, innovative API services that businesses can use, instead of locking up ideas and filing them away with the government so that they can be used at a later date in litigation and backdoor dealings.


Publishing Your API In The AWS Marketplace

I’ve been watching the conversation around how APIs are discovered since 2010, and I have been working to understand where things might be going beyond ProgrammableWeb, to the Mashape Marketplace, and even investing in my own API discovery format, APIs.json. It is a layer of the API space that feels very bipolar to me, with highs and lows, and a lot of meh in the middle. I do not claim to have “the solution” when it comes to API discovery and prefer just watching what is happening, and contributing where I can.

A number of interesting signals for API deployment, as well as API discovery, have been coming out of the AWS Marketplace lately. I find myself keeping a closer eye on the almost 350 API related solutions in the marketplace, and today I’m specifically taking notice of the Box API availability in the AWS Marketplace. I find this marketplace approach interesting, covering not just API discovery via an API marketplace, but also API deployment. AWS isn’t just a marketplace of APIs, where you find what you need and integrate directly with that provider. It is where you find your API(s) and then spin up an instance within your AWS infrastructure that facilitates that API integration–a significant shift.

I’m interested in the coupling between API providers and AWS. AWS and Box have entered into a partnership, but their approach provides a possible blueprint for how this model of API integration and deployment can scale. How tightly coupled each API provider chooses to be, looser (a proxy calling the API) or tighter (deploying the API as an AMI), will vary from implementation to implementation, but the model is there. The Box AWS Marketplace instance’s dependencies on the Box platform aren’t evident to me, but I’m sure they can easily be quantified, and it is something I can get other API providers to articulate when publishing their API solutions to the AWS Marketplace.

AWS is moving towards earlier visions I’ve had of selling wholesale editions of an API: helping you manage the on-premise and private label API contracts for your platform, and helping you explore the economics of providing wholesale editions of your platform, either tightly or loosely coupled with AWS infrastructure. It means decomposing your API platform into small deployable units of value that can be deployed within a customer’s existing AWS infrastructure, seamlessly integrating with existing AWS services.

I like where Box is going with their AWS partnership. I like how it is pushing forward the API conversation when it comes to using AWS infrastructure, and specifically the marketplace. I’ll keep an eye on where things are going. Box seems to be making all the right moves lately, going all in on the OpenAPI Spec, and decomposing their API platform to make it deployable and manageable from the cloud, but also much more modular and usable in a serverless way–providing us all with one possible blueprint for how we handle the technology and business of our API operations in the clouds.


The APIs.json For Trade.gov

There are a growing number of API providers who have published an APIs.json for their API operations, providing a machine-readable index of not just their APIs, but their entire API operations. My favorite example to use in my talks and conversations when I’m showcasing the API discovery format is the one for the International Trade Administration at developer.trade.gov.

The International Trade Administration (ITA), the government agency that “strengthens the competitiveness of U.S. industry, promotes trade and investment, and ensures fair trade through the rigorous enforcement of our trade laws and agreements,” provides an index of where you can find their developer portal, documentation, and terms of service, as well as a machine readable OpenAPI for their trade APIs.

I couldn’t think of a more shining example to use when talking about the API economy. I am pleased to have helped influence their API efforts, helping them see the importance of providing a machine readable index of their API operations with APIs.json, as well as describing their APIs using OpenAPI. If you need a well maintained and meaningful example of how APIs.json works, head over to developer.trade.gov and take a look.


The List Of API Signals I Track On In My API Stack Research

I keep an eye on several thousand companies as part of my research into the API space, and publish over a thousand of these profiles in my API Stack project. Across the over 1,100 companies, organizations, institutions, and government agencies, I'm regularly running into a growing number of signals that tune me into what is going on with each API provider, or service provider.

Here are the almost 100 types of signals I am tuning into as I keep an eye on the world of APIs, each contributing to my unique awareness of what is going on with everything API.

  • Account Settings (x-account-settings) - Does an API provider allow me to manage the settings for my account?
  • Android SDK (x-android-sdk) - Is there an Android SDK present?
  • Angular (x-angularjs) - Is there an Angular SDK present?
  • API Explorer (x-api-explorer) - Does a provider have an interactive API explorer?
  • Application Gallery (x-application-gallery) - Is there a gallery of applications built on an API available?
  • Application Manager (x-application-manager) - Does the platform allow me to manage my applications?
  • Authentication Overview (x-authentication-overview) - Is there a page dedicated to educating users about authentication?
  • Base URL for API (x-base-url-for-api) - What is the base URL(s) for the API?
  • Base URL for Portal (x-base-url-for-portal) - What is the base URL for the developer portal?
  • Best Practices (x-best-practices) - Is there a page outlining best practices for integrating with an API?
  • Billing history (x-billing-history) - As a developer, can I get at the billing history for my API consumption?
  • Blog (x-blog) - Does the API have a blog, whether at the company level, or preferably at the API and developer level as well?
  • Blog RSS Feed (x-blog-rss-feed) - Is there an RSS feed for the blog?
  • Branding page (x-branding-page) - Is there a dedicated branding page as part of API operations?
  • Buttons (x-buttons) - Are there any embeddable buttons available as part of API operations?
  • C# SDK (x-c-sharp) - Is there a C# SDK present?
  • Case Studies (x-case-studies) - Are there case studies available, showcasing implementations on top of an API?
  • Change Log (x-change-log) - Does a platform provide a change log?
  • Chrome Extension (x-chrome-extension) - Does a platform offer up open-source or white label chrome extensions?
  • Code builder (x-code-builder) - Is there some sort of code generator or builder as part of platform operations?
  • Code page (x-code-page) - Is there a dedicated code page for all the samples, libraries, and SDKs?
  • Command Line Interface (x-command-line-interface) - Is there a command line interface (CLI) alongside the API?
  • Community Supported Libraries (x-community-supported-libraries) - Is there a page or section dedicated to code that is developed by the API and developer community?
  • Compliance (x-compliance) - Is there a section dedicated to industry compliance?
  • Contact form (x-contact-form) - Is there a contact form for getting in touch?
  • Crunchbase (x-crunchbase) - Is there a Crunchbase profile for an API or its company?
  • Dedicated plans pricing page (x-dedicated-plans--pricing-page) - Is there a dedicated plans and pricing page for the API?
  • Deprecation policy (x-deprecation-policy) - Is there a page dedicated to deprecation of APIs?
  • Developer Showcase (x--developer-showcase) - Is there a page that showcases API developers?
  • Documentation (x-documentation) - Where is the documentation for an API?
  • Drupal (x-drupal) - Is there Drupal code, SDK, or modules available for an API?
  • Email (x-email) - Is an email address available for a platform?
  • Embeddable page (x-embeddable-page) - Is there a page of embeddable tools available for a platform?
  • Error response codes (x-error-response-codes) - Is there a listing or page dedicated to API error responses?
  • Events (x-events) - Is there a calendar of events related to platform operations?
  • Facebook (x-facebook) - Is there a Facebook page available for an API?
  • FAQ (x-faq) - Is there an FAQ section available for the platform?
  • Forum (x-forum) - Does a provider have a forum for support and asynchronous conversations?
  • Forum rss (x-forum-rss) - If there is a forum, does it have an RSS feed?
  • Getting started (x-getting-started) - Is there a getting started page for an API?
  • Github (x-github) - Does a provider have a Github account for the API or company?
  • Glossary (x-glossary) - Is there a glossary of terms available for a platform?
  • Heroku (x-heroku) - Are there Heroku SDKs, or deployment solutions?
  • How-To Guides (x-howto-guides) - Does a provider offer how-to guides as part of operations?
  • Interactive documentation (x-interactive-documentation) - Is there interactive documentation available as part of operations?
  • iOS SDK (x-ios-sdk) - Is there an iOS SDK for Objective-C or Swift?
  • Issues (x-issues) - Is there an issue management page or repo for the platform?
  • Java SDK (x-java) - Is there a Java SDK for the platform?
  • JavaScript API (x-javascript-api) - Is there a JavaScript SDK available for a platform?
  • Joomla (x-joomla) - Is there a Joomla plugin for the platform?
  • Knowledgebase (x-knowledgebase) - Is there a knowledgebase for the platform?
  • Labs (x-labs) - Is there a labs environment for the API platform?
  • Licensing (x-licensing) - Is there licensing for the API, schema, and code involved?
  • Message Center (x-message-center) - Is there a messaging center available for developers?
  • Mobile Overview (x-mobile-overview) - Is there a section or page dedicated to mobile applications?
  • Node.js (x-nodejs) - Is there a Node.js SDK available for the API?
  • OAuth Scopes (x-oauth-scopes) - Does a provider offer details on the available OAuth scopes?
  • OpenAPI Spec (x-openapi-spec) - Is there an OpenAPI available for the API?
  • Overview (x-overview) - Does a platform have a simple, concise description of what they do?
  • Paid support plans (x-paid-support-plans) - Are there paid support plans available for a platform?
  • Postman Collections (x-postman) - Are there any Postman Collections available?
  • Partner (x-partner) - Is there a partner program available as part of API operations?
  • Phone (x-phone) - Does a provider publish a phone number?
  • PHP SDK (x-php) - Is there a PHP SDK available for an API?
  • Privacy Policy (x-privacy-policy-page) - Does a platform have a privacy policy?
  • PubSub (x-pubsubhubbub) - Does a platform provide a PubSub feed?
  • Python SDK (x-python) - Is there a Python SDK for an API?
  • Rate Limiting (x-rate-limiting) - Does a platform provide information on API rate limiting?
  • Real Time Solutions (x-real-time-page) - Are there real-time solutions available as part of the platform?
  • Road Map (x-road-map) - Does a provider share their roadmap publicly?
  • Ruby SDK (x-ruby) - Is there a Ruby SDK available for the API?
  • Sandbox (x-sandbox) - Is there a sandbox for the platform?
  • Security (x-security) - Does a platform provide an overview of security practices?
  • Self-Service registration (x-self-service-registration) - Does a platform allow for self-service registration?
  • Service Level Agreement (x-service-level-agreement) - Is an SLA available as part of platform integration?
  • Slideshare (x-slideshare) - Does a provider publish talks on Slideshare?
  • Stack Overflow (x-stack-overflow) - Does a provider actively use Stack Overflow as part of platform operations?
  • Starter Projects (x-starter-projects) - Are there starter projects available as part of platform operations?
  • Status Dashboard (x-status-dashboard) - Is there a status dashboard available as part of API operations?
  • Status History (x-status-history) - Can you get at the history involved with API operations?
  • Status RSS (x-status-rss) - Is there an RSS feed available as part of the platform status dashboard?
  • Support Page (x-support-overview-page) - Is there a page or section dedicated to support?
  • Terms of Service (x-terms-of-service-page) - Is there a terms of service page?
  • Ticket System (x-ticket-system) - Does a platform offer a ticketing system for support?
  • Tour (x-tour) - Is a tour available to walk a developer through platform operations?
  • Trademarks (x-trademarks) - Are there details about trademarks, and how to use them?
  • Twitter (x-twitter) - Does a platform have a Twitter account dedicated to the API or even company?
  • Videos (x-videos) - Is there a page, YouTube, or other account dedicated to videos about the API?
  • Webhooks (x-webhook) - Are there webhooks available for an API?
  • Webinars (x-webinars) - Does an API conduct webinars to support operations?
  • White papers (x-white-papers) - Does a platform provide white papers as part of operations?
  • Widgets (x-widgets) - Are there widgets available for use as part of integration?
  • Wordpress (x-wordpress) - Are there WordPress plugins or code available?

There are hundreds of other building blocks I track on as part of API operations, but this list represents the most common ones, which often have dedicated URLs available for exploring, and have the most significant impact on API integrations. You'll notice there is an x- representation for each one, which I use as part of the APIs.json indexes for all the APIs I track on. Some of these signal types are machine readable, like OpenAPIs or a blog RSS feed, and others are machine readable because there is another API behind them, like Twitter or Github, but most of them are just static pages that a human (me) can visit to stay in tune with the signals.
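To show how these signals get applied, here is what a handful of them look like as properties on a single API entry in one of my APIs.json indexes. The API name and URLs here are placeholders, not a real provider:

```json
{
  "name": "Example API",
  "humanURL": "https://developer.example.com",
  "baseURL": "https://api.example.com",
  "properties": [
    { "type": "x-openapi-spec", "url": "https://developer.example.com/openapi.json" },
    { "type": "x-blog-rss-feed", "url": "https://developer.example.com/blog/rss" },
    { "type": "x-twitter", "url": "https://twitter.com/exampleapi" },
    { "type": "x-status-dashboard", "url": "https://status.example.com" }
  ]
}
```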

I have two primary objectives with this work: 1) identify the important signals that impact integration, and will keep me and my readers in tune with what is going on, and 2) identify the common channels, and help move the more important ones to be machine-readable, allowing us to scale the monitoring of important signals like pricing and terms of service. My API Stack research provides me with a nice listing of APIs, as well as more individualized stacks like Microsoft, Google, and Facebook, or even industry stacks like SMS, Email, and News. It also provides me with a wealth of signals we can tune into to better understand the scope and health of the API sector, and any individual business vertical that is being touched by APIs.


Expressing What An API Does As Well As What Is Possible Using OpenAPI

I am working to update my OpenAPI definitions for AWS, Google, and Microsoft using some other OpenAPIs I've discovered on Github. When a new OpenAPI has entirely new paths available, I just insert them, but when it has an existing path I have to think more critically about what is next. Sometimes I dismiss the metadata about the API path as incomplete or lower quality than the one I already have. Other times the content is actually superior to mine, and I incorporate it into my work. Now I'm also finding that in some cases I want to keep my representation, as well as the one I discovered, side by side--both having value.

This is one reason I'm not 100% sold on the fact that just API providers should be crafting their own OpenAPIs--sure, the API space would be waaaaaay better if ALL API providers had machine readable OpenAPIs for all their services, but I wouldn't want it to end there. You see, API providers are good (sometimes) at defining what their API does, but they often suck at telling you what is possible--which is why they are doing APIs. I have a lot of people who push back on me creating OpenAPIs for popular APIs, telling me that API providers should be the ones doing the hard work, otherwise it doesn't matter. I'm just not sold that this is the case, and I think there is an opportunity for external entities to evolve the definition of an API using OpenAPI.

To help me explore this idea, and push the boundaries of how I use OpenAPI in my API storytelling, I wanted to frame this in the context of the Amazon EC2 API, which allows me to deploy a single unit of compute into the cloud using an API--a pretty fundamental component of our digital worlds. To make any call against Amazon EC2, I send all my calls to a single base URL:

ec2.amazonaws.com

With this API call I pass in the "action" I'd like to be taken:

?Action=RunInstances

Along with this base action parameter, I pass in a handful of other parameters to further define things:

&ImageId=ami-60a54009&MaxCount=1&KeyName=my-key-pair&Placement.AvailabilityZone=us-east-1d

Amazon has never been known for superior API design, but it gets the job done. With this single API call I can launch a server in the clouds. When I was first able to do this with APIs is when the light really went on in my head regarding the potential of APIs. However, back to my story on expressing what an API does, as well as what is possible, using OpenAPI. AWS has done an OK job at expressing what the Amazon EC2 API does, however they suck at expressing what is possible. This is where API consumers like me can step up with OpenAPI and provide some alternative representations of what is possible with this highly valuable API.
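The request above is easy enough to assemble programmatically. Here is a small sketch that builds the Amazon EC2 Query API URL from an action and its parameters. Note that a real call would also need the AWS authentication and signature parameters, which I am leaving out here:

```python
from urllib.parse import urlencode

def ec2_request_url(action, **params):
    """Assemble an (unsigned) Amazon EC2 Query API URL."""
    query = {"Action": action, **params}
    return "https://ec2.amazonaws.com/?" + urlencode(query)

# The RunInstances call from above, launching a single server in the clouds.
url = ec2_request_url(
    "RunInstances",
    ImageId="ami-60a54009",
    MaxCount=1,
    KeyName="my-key-pair",
    **{"Placement.AvailabilityZone": "us-east-1d"},
)
```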

When I define the Amazon EC2 API using the OpenAPI specification I use the following:

swagger: '2.0'
info:
  title: Amazon EC2
host: ec2.amazonaws.com
paths:
  /:
    get:
      summary: The Amazon EC2 service
      operationId: ec2API
      parameters:
        - in: query
          name: action

The AWS API design pattern doesn't lend itself to reuse when it comes to documentation and storytelling, but I'm always looking for an opportunity to push the boundaries, and I'm able to better outline all available actions, as individual API paths by appending the action parameter to the path:

swagger: '2.0'
info:
  title: Amazon EC2
host: ec2.amazonaws.com
paths:
  /?Action=RunInstances/:
    get:
      summary: Run a new Amazon EC2 instance
      operationId: runInstance

Now I'm able to describe all 228 actions you can take with the single Amazon EC2 API path as separate paths in any OpenAPI generated API documentation and tooling. I can give them unique summaries, descriptions, and operationIds. OpenAPI allows me to describe what is possible with an API, going well beyond what the API provider was able to define. I've been using this approach to better quantify the surface area of APIs like Amazon, Flickr, and others who use this pattern for a while now, but as I was looking to update my work, I wanted to take this concept even further.

While appending query parameters to the path definition has allowed me to expand how I describe the surface area of an API using OpenAPI, I'd rather keep these parameters defined properly using the OpenAPI specification, and define an alternative way to make the path unique. To do this, I am exploring the usage of #bookmarks, to help make duplicate API paths unique in the eyes of the schema validators, but invisible to the server side of things--something like this:

swagger: '2.0'
info:
  title: Amazon EC2
host: ec2.amazonaws.com
paths:
  /#RunInstance/:
    get:
      summary: Run a new Amazon EC2 instance
      operationId: runInstance
      parameters:
        - in: query
          name: action
          default: RunInstances

I am considering how we can further make the path unique, by predefining other parameters using default or enum:

swagger: '2.0'
info:
  title: Amazon EC2
host: ec2.amazonaws.com
paths:
  /#RunWebSiteInstance/:
    get:
      summary: Run a new Amazon EC2 website instance
      description: The ability to launch a new website running on its own Amazon EC2 instance, from a predefined AWS AMI.
      operationId: runWebServerInstance
      parameters:
        - in: query
          name: action
          default: RunInstances
        - in: query
          name: ImageId
          default: ami-60a54009

I am still drawing within the lines of what the API provider has given me, but I'm now augmenting it with a better summary and description of what is possible using OpenAPI, which can now be reflected in documentation and other tooling that is OpenAPI compliant. I can even prepopulate the default values, or available options using enum settings, tailoring to my team, company, or other specific needs. This takes an existing API definition beyond its provider's interpretation of what it does, and gets to work on being more creative around what is possible.
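Since all of the actions follow the same pattern, expanding them into individual OpenAPI paths is easy to script. Here is a rough sketch of that expansion using a handful of sample actions; the action list and summaries are illustrative, and a complete version would enumerate all 228:

```python
# A few sample Query API actions with human-friendly summaries.
ACTIONS = {
    "RunInstances": "Run a new Amazon EC2 instance",
    "DescribeInstances": "List Amazon EC2 instances",
    "TerminateInstances": "Terminate Amazon EC2 instances",
}

def build_paths(actions):
    """Expand each Query API action into its own OpenAPI path entry,
    using the #bookmark convention to keep every path unique."""
    paths = {}
    for action, summary in actions.items():
        paths["/#%s/" % action] = {
            "get": {
                "summary": summary,
                "operationId": action[0].lower() + action[1:],
                "parameters": [
                    {"in": "query", "name": "action", "default": action},
                ],
            }
        }
    return paths

spec = {
    "swagger": "2.0",
    "info": {"title": "Amazon EC2"},
    "host": "ec2.amazonaws.com",
    "paths": build_paths(ACTIONS),
}
```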

Let me know how incoherent this is. I can't tell sometimes. Maybe I need more examples of this in action. I feel like it might be a big piece of the puzzle that has been missing for me regarding how we tell stories about what is possible with APIs. When it comes to API definitions, documentation, and discovery I feel like we are chained to a provider's definition of what is possible, when in reality this shouldn't be what drives the conversation. There should be definitions, documentation, and discovery documents created by API providers that help articulate what an API does, but more importantly, there should be a wealth of definitions, documentation, and discovery documents created by API consumers that help articulate what is possible. 


Thinking About Schema.org's Relationship To API Discovery

I was following the discussion around adding a WebAPI class to Schema.org's core vocabulary, and it got me thinking more about the role Schema.org has to play with not just our API definitions, but also in significantly influencing API discovery. Meaning that we should be using Schema.org as part of our OpenAPI definitions, providing us with a common vocabulary for communicating around our APIs, but also empowering the discovery of APIs.

When I describe the relationship of Schema.org to API discovery, I'm talking about using the pending WebAPI class, but I'm also talking about using the common Schema.org vocabulary within API definitions--something that will open the definitions up to discovery because they employ a common schema. I am also talking about how we can leverage this vocabulary in our HTML pages, helping search engines like Google understand that there is an API service available.
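Here is roughly what that could look like, embedding the pending WebAPI class as JSON-LD in a developer portal's HTML. The names and URLs are placeholders, and since WebAPI is still a pending class, the exact properties may shift before it lands in the core vocabulary:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebAPI",
  "name": "Example API",
  "description": "A hypothetical API, marked up so search engines know it is here.",
  "documentation": "https://developer.example.com/docs",
  "termsOfService": "https://example.com/terms",
  "provider": {
    "@type": "Organization",
    "name": "Example, Inc."
  }
}
</script>
```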

I will also be exploring how I can better leverage Schema.org in my APIs.json format, better leveraging a common vocabulary to describe API operations, not just an individual API. I'm looking to expand the opportunities for discovery, not limit them. I would love all APIs to take a page from the hypermedia playbook, and have a machine readable index for each API, with a set of links present with each response, but I also want folks to learn about APIs through Google, ensuring they are indexed in a way that search engines can comprehend.

When it comes to API discovery I am primarily invested in APIs.json (because it's my baby) for describing API operations, and OpenAPI for describing the surface area of an API, but I also want this to map to the very SEO driven world we operate in right now. I will keep investing time in helping folks use Schema.org in their API definitions (APIs.json & OpenAPI), but I will also start investing in folks employing JSON-LD and Schema.org as part of their search engine strategies, making our APIs more discoverable to humans as well as other systems.


Mapping Github Topics To My API Evangelist Research

I was playing around with the new Github topics, and found that it provides an interesting look at the API space, one that I'm hoping will continue to evolve, and maybe I can influence.

I typed 'api-' into Github's topic tagging tool for my repository, and after I tagged each of my research areas with appropriate tags, I set out exploring these layers of Github by clicking on each tag. It is something that became quite a wormhole of API exploration.

I had to put it down, as I could spend hours looking through the repositories, but I wanted to create a machine-readable mapping to my existing API research areas, that I could use to regularly keep an eye on these slices of the Github pie--in an automated way.
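Here is a rough sketch of what that machine-readable mapping could look like in YAML, pairing each of my research areas with the Github topics I'd watch. The topic names are examples of the kind of tags I'm finding, not a definitive list:

```yaml
# Hypothetical mapping of API research areas to Github topic searches.
research:
  - area: definitions
    topics: [api-definition, openapi, api-blueprint]
  - area: design
    topics: [api-design]
  - area: deployment
    topics: [api-deployment, serverless]
  - area: management
    topics: [api-management]
  - area: documentation
    topics: [api-documentation, api-console]
  - area: sdk
    topics: [api-sdk]
  - area: discovery
    topics: [api-discovery]
```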

Definitions - These are the topics I'm adding to my monitoring of the API space when it comes to API definitions. I thought it was interesting how folks are using Github to manage their API definitions.

I like how OpenAPI is starting to branch out into separate areas, as well as how this area touches on almost every other area listed here. I am going to work to help shape the tags present based on the definitions, templates, and tooling I find on Github in my work.

Design - There was only one API design related item, but it is something I expect to expand rapidly as I dive into this area further.

I know of a number of projects that should be tagged and added to the area of API design, as well as have a number of sub-areas I'd like to see included as relevant API design tags.

Deployment - Deployment was a little tougher to get a handle on. There are many different ways to deploy an API, but these are the ones I've identified so far.

I know I will be adding some other areas to this one quickly, tracking database, containerized, and serverless approaches to API deployment.

Management - There were two topics that jumped out to me for inclusion in my API management research.

As with all the other areas, I will be harassing some of the common API management providers I know to tag their repositories appropriately, so that they show up in these searches.

Documentation - There are always a number of different perspectives on what constitutes API documentation, but these are a few of these I've found so far.

I think that API console overlaps with API clients, but it works here as well. I will work to find a way to separate out the documentation tools, from the documentation implementations.

SDK - It is hard to identify what constitutes an SDK. It is a sector of the space where I've seen renewed innovation, as well as a bending of the definition of what a development kit is.

I will be looking to identify language-specific variations as part of this mapping to API SDKs available on Github, making them discoverable through topic searches.

API Portal - It was good to see wicked.haufe.io as part of an API portal topic search. I know of a couple of other implementations that should be present, helping people see this growing area of API deployment and management.

This approach to providing Github driven API templates is the future of both the technical and business side of API operations. It is the seed for continuous integration across all stops along the API lifecycle.

API Discovery - Currently it is just my research in the API discovery topic search, but it is where I'm putting this area of my work down. I was going to add all my research areas, but I think that will make for a good story in the future.

API discovery is one of the areas I'm looking to stimulate with this Github topics work. I'm going to be publishing separate repositories for each of the APIs I've profiled as part of my monitoring of the API space, and highlighting those providers who do it as well. We need more API providers to publish their API definitions to Github, making them available to be applied at every other stop along the API lifecycle.

I've long used Github as a discovery tool. Tracking on the Github accounts of companies, organizations, institutions, agencies, and individuals is the best way to find the meaningful things going on with APIs. Github topics just add another dimension to this discovery process, where I don't always have to do the discovery myself; other people can tag their repositories, and they'll float up on the radar. Github repo activity, stars, and forks add yet another dimension to this conversation.

I will have to figure out how to harass people I know about properly tagging their repos. I may even submit a Github issue for some of the ones I think are important enough. Maybe Github will allow users to tag other people's projects, adding another dimension to the conversation, while giving consumers a voice as well. I will update the YAML mapping for this project as I find new Github topics that should be mapped to my existing API research.


Discovering New APIs Through Security Alerts

I tune into a number of different channels looking for signs of individuals, companies, organizations, institutions, and government agencies doing APIs. I find APIs using Google Alerts, monitoring Twitter and Github, using press releases and via patent filings. Another way I am learning to discover APIs is via alerts and notifications about security events.

An example of this can be found via the Industrial Control Systems Cyber Emergency Response Team out of the U.S. Department of Homeland Security (@icscert), with the recently issued advisory ICSA-16-287-01 OSIsoft PI Web API 2015 R2 Service Acct Permissions Vuln on the ICS-CERT website, leading me to the OSIsoft website. They aren't very forthcoming with their API operations, but this is something I am used to, and in my experience, companies who aren't very public with their operations tend to also cultivate an environment where security issues go unnoticed.

I am looking to aggregate API related security events and vulnerabilities like the feed coming out of Homeland Security. This information needs to be shared more often, opening up further discussion around API security issues, and even possibly providing an API for sharing real-time updates and news. I wish more companies, organizations, institutions, and government agencies would be more public with their API operations and be more honest about the dangers of providing access to data, content, and algorithms via HTTP, but until this is the norm, I'll continue using API related security alerts and notifications to find new APIs operating online.


What Is APIs.json? And What Is Next For the API Discovery Format?

As part of a renewed focus on the API discovery definition format APIs.json, I wanted to revisit the proposed machine readable API discovery specification, and see what is going on. First, what is APIs.json? It is a machine readable JSON specification, that anyone can use to define their API operations. APIs.json does not describe your APIs like OpenAPI Spec and API Blueprint do, it describes your surrounding API operations, with entries that can reference your OpenAPI Spec, API Blueprint, or any other format that you desire.

APIs.json Is An Index For API Operations
APIs.json provides a machine readable way for API providers to describe their API operations, similar to how website providers describe their sites using sitemap.xml. Here are the API providers who are describing their APIs using APIs.json:

APIStrat Austin API
API Evangelist
Acuity Scheduling
BreezoMeter
CheckMarket
Clarify
Data Validation
DNS Check
Email Hunter
FeedbackHub
Fitbit
Gavagai
Kin Lane
Link Creation Studio
OneMusicAPI
Pandorabots API
Qalendra
RiteTag
Singlewire
SiteCapt
Social Searcher API
Super Monitoring
Timekit
Trade.gov
Twitch Bot Directory
EnClout
frAPI
Section.io
Spoonacular
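
A minimal sketch of what one of these index files contains, loaded here with Python. The field names follow the APIs.json specification, but the file is trimmed and the URLs are hypothetical, purely for illustration:

```python
import json

# A minimal APIs.json index; a search engine like APIs.io can crawl
# this the same way Google crawls a sitemap.xml.
apis_json = json.loads("""
{
  "name": "Example API",
  "description": "An index of the APIs and operations for example.com.",
  "url": "http://example.com/apis.json",
  "specificationVersion": "0.15",
  "apis": [
    {
      "name": "Example Blog API",
      "humanURL": "http://developer.example.com",
      "baseURL": "http://api.example.com",
      "properties": [
        {"type": "Swagger", "url": "http://example.com/swagger.json"},
        {"type": "X-documentation", "url": "http://developer.example.com/docs"}
      ]
    }
  ],
  "include": []
}
""")

# List each API entry in the index, with where it actually lives.
for api in apis_json["apis"]:
    print(api["name"], "->", api["baseURL"])
```

Notice the index doesn't describe the API's endpoints at all; it points at a Swagger definition via the properties collection, which is exactly the division of labor described above.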

APIs.json Indexes Can Be Created By 3rd Parties
One important thing to add is that these APIs.json files can also be crafted, and published, by external parties. An example of this is with the Trade.gov APIs. I originally created that APIs.json file, and coordinated with them to eventually get it published under their own domain, making it an authoritative APIs.json file. Many APIs.json files will be born outside of the API operations they describe, something you can see in my API stack project:

  • The API Stack - Provides almost 1000 APIs.json files that describe the API operations of many leading public API platforms. There are also around 300 OpenAPI specifications for some of the platforms described.

APIs.json Can Be Used To Describe API Collections
Beyond describing a single API, within a single domain, APIs.json can also be used to describe entire collections of APIs, providing a machine readable way to organize, and share valuable collections of API resources. Here are a few examples of projects that are producing APIs.json driven collections.

APIs.json Can Be Used To Describe Collections of Collections
Taking things up another rung in the chain, APIs.json can also provide a collection of collections, something I do with my own APIs. Each Github organization on my network has a master APIs.json, providing include links to all other APIs.json within the organization. In this scenario I have over 30 other APIs.json indexed, which can all operate independently of each other, but can also be considered a collection of API collections.

  • Master - A master collection of API collections I maintain as part of the API Evangelist network operations.

The First Open Source Tooling For APIs.json
Up until now, this post is all about APIs.json, when in reality the format is useless without there being any tooling built on top of the specification, bringing value to the table. This is why the 3Scale team got to work building an open source APIs.json driven search engine:

  • APIs.io as an open source tool dedicated to APIs.json
  • APIs.io as a public API search engine, with APIs.json as index.
  • APIs.io as a private API search engine, with APIs.json as index.

APIs.json Driving Other Open Tooling
APIs.io is just the beginning. It won't be enough to convince all API providers that they should be producing an APIs.json index of their site operations, just for the API discovery boost. We are going to need APIs.json driven tooling that will service every other stop along the life cycle, including:

  • HTTP Client / Hub / Workbenches
  • Documentation
  • Testing
  • Monitoring
  • Virtualization
  • Visualization

APIs.json Integrated Into Existing Platforms
What areas would you like to see served? Personally, I would like to have the ability to load / unload my APIs.json collections into any service that I use, allowing me to organize the internal, public, and 3rd party APIs I depend on within any platform out there that is servicing the API space. Here are a handful of those types of integrations that are already happening:

APIs.json Linking To The Human Aspects Of API Operations
APIs.json is just the scaffolding to hang links to essential aspects of your operations, it doesn't care what you link to. You can start by referencing essential links for your API operations like:

  • Signup - How to signup for a service.
  • Support - Where to get support. 
  • Terms of Service - Where are the terms of service.
  • Pricing - Where to find the pricing for a service.
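
As a sketch, these human-focused links live as entries in an API's properties collection, right alongside any machine readable ones. The X- prefixed type names below mirror the convention used for property elements that are not yet formal machine readable definitions, and the URLs are hypothetical:

```python
# Hypothetical property entries for the human aspects of API operations.
human_properties = [
    {"type": "X-signup", "url": "https://example.com/signup"},
    {"type": "X-support", "url": "https://example.com/support"},
    {"type": "X-terms-of-service", "url": "https://example.com/tos"},
    {"type": "X-pricing", "url": "https://example.com/pricing"},
]

def find_property(properties, type_name):
    """Return the URL for the first property of a given type, or None."""
    for prop in properties:
        if prop["type"].lower() == type_name.lower():
            return prop["url"]
    return None

print(find_property(human_properties, "X-pricing"))
```

Tooling consuming an index doesn't need to understand every type; it can simply surface the links it recognizes, and ignore the rest.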

APIs.json Linking to Machine Readable Aspects of API Operations
These do not have to be machine readable links, they can reference important things the humans will need first. However, ultimately the goal is to make as much of the APIs.json index as machine readable as possible, using a variety of existing API definition formats, available for a variety of purposes.

Defining New, Machine Readable Property Elements For APIs.json
While the APIs.json spec will evolve, something I talk about below, its real strength lies in its ability to incentivize the development of entirely new, machine readable API definitions, bringing even more value to the API discovery process. Here are a few of the additional specs being crafted independent of, but inspired by APIs.json:

  • API Plans, for pricing, plans & rate limits.
  • API Monitoring, for monitoring & testing.
  • API Changelog, for operational monitoring.
  • API SDK, for SDK reference.
  • API Conversations - for the stream around API operations

Roadmap for Version 0.16 of APIs.json
That is the 100K view of what APIs.json is now, and the short term plan for the future. Most of the change within the universe APIs.json is mapping will occur at the individual API level, and within the machine readable specs that describe them, like OpenAPI Spec, API Blueprint, and Postman. Secondarily, there will be additional, machine readable API types being defined and added into the spec.

Even with this reality, we do have a handful of changes planned for the 0.16 version of APIs.json:

  • commons - Establish a top level collection of common property elements that apply to ALL APIs being referenced in an APIs.json
  • country - Adding a top level country reference using ISO 3166.
  • New Property Elements - Suggesting a handful of new property elements to reference common API operation building blocks
    • Registration
    • Blog
    • Github
    • Twitter

I doubt we will see many new additions like commons and country. In the future, most of the structural changes to APIs.json will be derived from first class property elements (i.e. adding documentation or Github), making this the proving ground for defining what are truly the most important aspects of API operations, and what should be machine readable vs human readable.

The Hard Work That Lies Ahead for APIs.json
That concludes defining what is APIs.json, and what is next for APIs.json. Now we really have to get to work, doing the heavy lifting around:

  • Getting more API providers to describe their API operations using APIs.json, and publish in the root of the domain for their API ecosystem.
  • Encourage more API evangelists, brokers & analysts to describe their collections using APIs.json, building more meaningful indexes and directories of high value APIs.
  • Encourage platforms to build APIs.json into their operations, as a storage and organization schema, but also as an import / export format.
  • Incentivize the development of more meaningful tooling that employs APIs.json, and uses it to better serve the API life cycle.
  • Continue to add new API property elements, making sure as many of them as possible evolve to be machine readable, as well as first class citizens in the APIs.json specification.

You can stay involved with what we are up to via the APIs.json website, and the APIs.json Github repository. You can also stay in tune with what is going on with APIs.io via its website, and its Github repository. If you are doing something with APIs.json, ranging from using it as an index for your API operations, to platform integrations, please let me know. Also, if you envision some interesting tooling you'd like to see happen, make sure and submit a Github issue letting us know.

While we still have huge amounts of work to do, when it comes to delivering meaningful API discovery solutions that the industry can put to work, I am pretty stoked with what we have managed to do over the last two years of work on the APIs.json specification, and supporting tooling--momentum that I feel picking up in 2016.


Solution Discovery Instead of API Discovery Via API Aggregation and Reciprocity Providers

During my API discovery session talk at @APIStrat Austin this last November, I talked about what I see as an added dimension to the concept of API discovery, one that will become increasingly important when it comes to actually moving things forward --- discovering solutions that are API driven vs. API discovery, where a developer is looking for an API. 

It might not seem that significant to developers, but SaaS services like Zapier, DataFire, and API hubs like Cloud Elements, bring this critical new dimension to how people actually will find your APIs. As nice as ProgrammableWeb has been for the last 10 years, we have to get more sophisticated about how we get our APIs in front of would-be consumers. We just can't depend on everyone who will put our API to work, immediately thinking that they need an API--most likely they are just going to need a solution to their problem, and secondarily need to understand there is an API driving things behind the scenes.

One of many examples of this in the wild could be in the area of tech support for your operations. Maybe you use Jira currently, because this is what your development team uses, but with the latest release you need something a little more public facing. When you are exploring what is possible with API reciprocity services like Zapier, and API hubs like Cloud Elements, you get introduced to other API driven solutions like Zendesk, or Desk.com from SalesForce.

This is just one example of how APIs can make an impact on the average business user, and will be the way API discovery happens in the future. In this scenario, I didn't set out looking for an API, but because I use API enabled service providers, I am introduced to other alternative solutions that might also help me tackle the problem I have. I may never have even known SalesForce had a help desk solution if I wasn't already exploring the solutions Cloud Elements brings to the table.

As an API provider, you need to make sure your APIs are available via the growing number of API aggregation and reciprocity providers, and make sure the solutions they bring to the table are easily discoverable. You need to think beyond the classic developer focused version of API discovery, and make sure and think about API driven solution discovery meant for the average business or individual user.

Disclosure: Cloud Elements is an API Evangelist partner.


Evolving My API Stack To Be A Public Repo For Sharing API Discovery, Monitoring, And Rating Information

My API Stack began as a news site, and evolved into a directory of the APIs that I monitor in the space. I published APIs.json indexes for the almost 1000 companies I am tracking on, with almost 400 OADF files for some of the APIs I've profiled in more detail. My mission around the project so far has been to create an open source, machine readable repo for the API space.

I have had two recent occurrences that are pushing me to expand on my API Stack work. First, I have other entities who want to contribute monitoring data and other elements I would like to see collected, but haven't had time for. The other is that I have started spidering the URLs of the API portals I track on, and need a central place to store the indexes, so that others can access them.

Ultimately I'd like to see the API Stack act as a public repo, where anyone can grab the data they need to discover, evaluate, integrate, and stay in tune with what APIs are doing, or not doing. In addition to finding OADF, API Blueprint, and RAML files by crawling and indexing API portals, and publishing them in a public repo, I want to build out the other building blocks that I index with APIs.json, like pricing and TOS changes, and potentially make monitoring, testing, and performance data available.

Next I will publish some pricing, monitoring, and portal site crawl indexes to the repo, for some of the top APIs out there, and start playing with the best way to store the JSON, and other files, and provide an easy way explore and play with the data. If you have any data that you are collecting, and would like to contribute, or have a specific need you'd like to see tracked on, let me know, and I'll add to the road map.

My goal is to go for quality and completeness of data there, before I look to scale, and expand the quantity of information and tooling available. Let me know if you have any thoughts or feedback.


An Overview Of API Discovery From @APIStrat

I am delivering my API discovery talk from @APIStrat in Austin tomorrow AM. It will be via a Google Hangout, beginning at 8:00 AM PST. Jim Laredo of IBM's API Harmony, Jerome Louvel of Restlet, and Nicolas Grenie of 3Scale and APIs.io will be coming together for the hangout, with Natalie Kerns of Cloud Elements helping moderate again. 

To prepare for the hangout, I wanted to revisit my talk, a perfect opportunity to build off the momentum from the event, and share the story on API Evangelist. You can find the slide deck from my original talk on my talks.kinlane.com project site, and I will post the video from the Google Hangout here on this blog when we are done tomorrow.

 

Other Emerging Solutions

API Discovery via Relationships

Discovery Via Integrated Development Environments (IDE)

API Discovery At Client Runtime

What APIs Exist Publicly?

What APIs Exist Privately?

What APIs that Should Exist?

How APIs Are Being Used?

Focus On Finding Just The Right API

The Aggregation, Integration, and Interoperability of APIs

Meaningful Procesess and Reciprocity Via APIs

Discovery Goes Well Beyond Just Developers Finding APIs


Providing APIs.json As A Discovery Media Type For Every One Of My API Endpoints

It can be easy to stumble across the base URL for one of my APIs out on the open Internet. I design my APIs to be easily distributed, shared, and as accessible as possible, based upon what I feel the needs for the resource might be. You can find most of my APIs as part of my master stack, but there are other APIs, like my screen capture API or my image manipulation API, that are often orphaned, and I know some people could use help identifying more of the resources that are behind my API operations.

To help support discovery across my network of APIs, I'm going to be supporting requests for Content-Type: application/apis+json for each endpoint, as well as an apis.json file in the root of the API, and supporting portal. An example of this in action can be seen with my blog API, where you can look in the root of the portal for the API (kin-lane.github.io/blog/apis.json), and in the root of the base URL for the API (blog.api.kinlane.com/apis.json), and for each individual endpoint, like the (blog.api.kinlane.com/blog/) endpoint, you can request the Content-Type: application/apis+json, and get a view of the APIs.json discovery file.
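
A server-side sketch of that content negotiation, as a plain function rather than any particular web framework. Note that application/apis+json is a proposed media type, not yet a registered one, and the handler name here is hypothetical:

```python
def negotiate(accept_header, apis_json_index, default_body):
    """Return a (content type, body) pair for a request to an endpoint.

    When the client's Accept header asks for the proposed
    application/apis+json media type, serve the APIs.json discovery
    document instead of the endpoint's normal response.
    """
    if "application/apis+json" in accept_header:
        return ("application/apis+json", apis_json_index)
    return ("application/json", default_body)

# A client asking for discovery information gets the APIs.json index;
# everyone else gets the regular API response.
content_type, body = negotiate(
    "application/apis+json",
    {"name": "Blog API", "url": "http://blog.api.kinlane.com/apis.json"},
    {"posts": []},
)
print(content_type)
```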

It will take me a while to get this rolled out across all of my APIs; so far I have worked out the details on my blog, and API APIs. Providing discovery at the portal, API, and endpoint level just works. It provides not just access to documentation, but the other critical aspects of API operations, in a machine readable way, wherever you need it. It is nice to be on the road to having APIs.json exist as the media type (application/apis+json), something that isn't formal yet, but we are getting much closer with the latest release, and planned releases.

Next, I will push this out across all my APIs, and do another story to capture what things look like at that point. Hopefully it is something I can encourage others to do eventually, making API discovery a little more ubiquitous across API operations.


My API Discovery Research

I am giving each of my primary API research sites a refresh, and first up is the home page of my API discovery research. As I update each home page, I'm going to publish here on API Evangelist to help bring more awareness to each of the main areas I'm studying.

This is one of my API research sites, focused specifically on API discovery. My name is Kin Lane, and I am the API Evangelist, working as hard as I can to understand the world of Application Programming Interfaces, widely called APIs. This network of API research projects all runs on Github, and is my real-time workbench, which means there is a lot of finished work present, but occasionally you will also come across projects that are unfinished. You have stumbled into my API discovery research; you will find the main API Evangelist site over here, with other links to my work.

This site is where I publish news that I have read, stories I’ve published, companies I’ve profiled, and valuable tools I’ve stumbled across while researching API discovery. Finding APIs, and having your APIs found, is a pretty significant pain point, and in the last ten years little has been done to provide adequate solutions. API discovery is an area I've been monitoring, and ultimately I have been unsatisfied with what I've seen, so I've taken matters into my own hands and created APIs.json, a machine readable API discovery format any API provider can use to describe their APIs.

There are several API management companies, including Apigee, Mashery, and Mashape, who have put forth API discovery solutions, and while these offerings are valuable, I feel they lack the openness necessary to truly move the API space, and the API discovery conversation, forward. APIs.json was created to help make sense of the API space, in partnership with 3Scale API management infrastructure CEO Steve Willmott (@njyx). Together we are working to define as much of the public API space as possible using APIs.json, and encourage the development of open tooling around the format, with the first major addition being an open source API search engine called APIs.io.

Next I am working with other providers like Socrata, WSO2, and others to develop additional tooling, to serve all levels of the API sector. I do not feel API discovery is something that simply happens via an API directory like ProgrammableWeb, or even via API search engines like APIs.io--in the future API discovery will also occur in the browser, via our IDEs, and seamlessly within modern clients. In 2015, you'll see many more APIs.json driven efforts to help alleviate API discovery pain, and we hope the format, and supporting open tooling, will stimulate other complementary, or even competing, efforts. API discovery is long overdue for some investment by the community, helping us get past some of our PTSD from the SOA era--we can do it!

All my research is openly licensed CC-BY, and is meant to help grow the awareness around healthy API discovery practices. I try to be as fair as I can when covering companies, individuals, and the tools they provide, but ultimately you will notice I have my favorites, and there are some areas I only touch on lightly, for a variety of personal reasons. I try to stay as neutral as I can when it comes to technological dogma, and company allegiance, but after almost five years, I have some pretty strong opinions, and can’t help but try and steer, and influence things in my own unique way. ;-)


Using APIs.json For My Microservice Navigation And Discovery

I’m rebuilding my underlying architecture using microservices and docker containers, and the glue I’m using to bind it all together is APIs.json. I’m not just using APIs.json to deliver on discoverability for all of my services, I am also using it to navigate around my stack. Right now I only have about 10 microservices running, but I have a plan to add almost 50 in total by the time I’m done with this latest sprint.

Each microservice lives as its own Github repository, within a specific organization. I give each one its own APIs.json, indexing all the API elements of that specific microservice. APIs.json has two main collections, "apis" and "include". For each microservice APIs.json, I list all the properties for that API, but I use the include element to document the URLs of other microservice APIs.json in the collection.

All the Github repositories for this microservice stack live within a single Github organization, which I give a "master" repo that acts as a single landing page for the entire stack. It has its own APIs.json file, but rather than having any API collections, it just uses includes, referencing the APIs.json for each microservice in the stack.

APIs.json acts as an index for each microservice, but through the include collection it also provides links to other related microservices within its own stack, which I use to navigate, in a circular way between all supporting services. All of that sounds very dizzying to write out, and I’m sure you are like WTF? You can browse my work on Github, some of it is public, but much of it you have to have oAuth access to see. The public elements all live in the gh-pages branch, while the private aspects live within the private master branch.
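
To make the dizzying part concrete, here is a sketch of that master index and how a client could walk it. The URLs and the resolver are hypothetical stand-ins for fetching each APIs.json file over HTTP:

```python
# A master APIs.json with no API entries of its own, only includes
# pointing at each microservice's APIs.json (URLs are hypothetical).
master = {
    "name": "Master",
    "apis": [],
    "include": [
        {"name": "Blog", "url": "http://blog.api.example.com/apis.json"},
        {"name": "Notes", "url": "http://notes.api.example.com/apis.json"},
    ],
}

def walk(index, fetch, seen=None):
    """Yield every API across an APIs.json index and its includes.

    `fetch` resolves an APIs.json URL to a parsed document; `seen`
    guards against the circular references between services.
    """
    seen = seen if seen is not None else set()
    for api in index.get("apis", []):
        yield api
    for inc in index.get("include", []):
        if inc["url"] not in seen:
            seen.add(inc["url"])
            yield from walk(fetch(inc["url"]), fetch, seen)

# Stand-in for HTTP GETs of each microservice's APIs.json.
documents = {
    "http://blog.api.example.com/apis.json": {"apis": [{"name": "Blog API"}], "include": []},
    "http://notes.api.example.com/apis.json": {"apis": [{"name": "Notes API"}], "include": []},
}

names = [api["name"] for api in walk(master, documents.get)]
print(names)  # → ['Blog API', 'Notes API']
```

The `seen` set is what lets the services reference each other circularly, as described above, without the traversal looping forever.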

This is all a living workbench for me, so expect broken things. If you have questions, or would like more access to better understand, let me know. I’m happy to consider adding you to the Github organization as collaborator so that you can see more of it in action. I will also chronicle my work here on the blog, as I have time, and have anything interesting things to share.


Do You Know That Hypermedia Is A Better Solution For Discovery Than APIs.json?

I spend a lot of time fielding questions from people about APIs.json. This is something I expect to be doing for the next 10 years, and I am happy to field questions about exactly what it is all about, and help educate folks about exactly where APIs.json fits into the overall API landscape.

A regular comment I get from technologists, and API savvy folks, is “you know that hypermedia is a better solution for discovery than APIs.json?” To which I reply “yes I know, but hypermedia is a solution for the API world we want, and APIs.json is a solution for the world we have”. I would love it if everyone understood hypermedia, and designed and deployed their APIs with this knowledge in mind, something I’ve been spending a lot more time on in 2014, and will continue to push in 2015.

In the meantime, I intend to continue stitching together the thousands of APIs available out there, using APIs.json, allowing them to be discovered via open source public search engines like APIs.io, private internal search engines, and IDE solutions like Codenvy. We don’t just need discovery solutions to find new APIs, we need ways to build collections of high value APIs, something APIs.json excels at.

Another thing that APIs.json does, that I don’t feel hypermedia provides a solution to, is the discovery of the other technical, business, and political building blocks, like the documentation, code libraries, SDKs, terms of service, and other critical elements of API operations. This is an area that APIs.json was specifically designed for, allowing the discovery of these vital areas, in addition to the actual API interfaces.

As I do with many of my posts, this one is crafted to be a cookie cutter response to these comments I get regularly about hypermedia vs. APIs.json, allowing me to just reply with a link to this story on my blog. Hopefully we can shift the tide of API deployments in favor of hypermedia, but the APIs we can’t do this with, APIs.json is a healthy alternative, and I even recommend using it alongside hypermedia to provide easy access to the other critical building blocks of API operations, that technologists often overlook.


API Discovery Continues Its Move Into The IDE With Eclipse Che

Another layer to the release of Codenvy this week, was the announcement of the Eclipse project Che, an open source "project to create a platform for SAAS developer environment that contains all of the tools, infrastructure, and processes necessary for a developer to edit, build, test, and debug an application”. Che represents the next generation IDE that runs in the cloud, which coincides with other signs I've seen around API discovery moving into the IDE with signals from API pioneers like SalesForce and Google, or from Microsoft in Visual Studio.

I’m still learning about Che, but I’m beginning to see two distinct ways Che and APIs can be employed. First lets start with the environment:

You can build extensions for Che, and when those extensions are compiled into the kernel, Che creates server-side microservices, a RESTful API, and cross-browser optimized JavaScript, which is then pluggable into a browser IDE. With Che, developers are given the pluggable structure of Eclipse, but accessible through a cloud environment.

This means you can orchestrate development workflows with APIs. You can predefine, deploy, customize, maintain, and orchestrate a very modular development environment where everything can be controlled via API. What a perfect environment for orchestrating the next generation of application development.

Next I see APIs leading the development lifecycle as well, not just defining the development environment and process. Earlier stories I’ve done on API discovery via SDKs showcase SalesForce and Google providing native discovery of their APIs directly in Eclipse. In an Eclipse Che driven development environment, you could define pre-built, or custom, API stacks, bringing exactly the API resources developers will need and baking them directly into their IDE, meaning APIs find the developers, developers don’t have to find their own APIs.

This approach to API discovery via the IDE provides some very interesting opportunities for marrying with earlier thoughts I’ve had around being an API broker. I can envision a future where API evangelism evolves to a point where you don’t just represent one API, you represent many APIs, and configure API fueled developer environments for building any type of application. Think of what Backend as a Service (BaaS) providers like Parse and Kinvey have been doing since 2011, but now think of pre-configured, or custom tailored, IDE environments with exactly the resources you need.

I’m just getting started with Che, and Codenvy, so it will take me a while to work through my thoughts on using it as an API brokerage platform. The one thing you can count on me for, is that I will tell stories all along the way as I figure it out.

Disclosure: API Evangelist is an advisor to Codenvy.


An API Evangelism Strategy To Map The Global Family Tree

In my work every day as the API Evangelist, I get to have some very interesting conversations with a wide variety of folks about how they are using APIs, as well as brainstorming other ways they can approach their API strategy, allowing them to be more effective. One of the things that keeps me going in this space is this diversity. One day I’m looking at Developer.Trade.Gov for the Department of Commerce, the next talking to WordPress about APIs for 60 million websites, and then I’m talking with The Church of Jesus Christ of Latter-day Saints about the Family Search API, which is actively gathering, preserving, and sharing genealogical records from around the world.

I’m so lucky I get to speak with all of these folks about the benefits, and perils of APIs, helping them think through their approach to opening up their valuable resources using APIs. The process is nourishing for me because I get to speak to such a diverse number of implementations, push my understanding of what is possible with APIs, while also sharpening my critical eye, understanding of where APIs can help, or where they can possibly go wrong. Personally, I find a couple of things very intriguing about the Family Search API story:

  1. Mapping the world's genealogical history using a publicly available API -- Going Big!!
  2. Potential for participation by not just big partners, but the long tail of genealogical geeks
  3. Transparency, openness, and collaboration shining through as the solution beyond just the technology
  4. The mission driven focus of the API overlapping with my obsession for API evangelism intrigues and scares me
  5. Has an existing developer area, APIs, and seemingly necessary building blocks, but has failed to achieve platform level

I’m open to talking with anyone about their startup, SMB, enterprise, organizational, institutional, or government API, always leaving open a 15 minute slot to hear a good story, which turned into more than an hour of discussion with the Family Search team. See, Family Search already has an API, they have the technology in order, and they even have many of the essential business building blocks as well, but where they are falling short is when it comes to dialing in both the business and politics of their developer ecosystem to discover the right balance that will help them truly become a platform—which is my specialty. ;-)

This brings us to the million dollar question: How does one become a platform?

All of this makes Family Search an interesting API story. Given the scope of the API, to take something this big to the next level, Family Search has to become a platform, and not a superficial “platform” where they are just catering to three partners, but one nourishing a vibrant long tail ecosystem of website, web application, single page application, mobile application, and widget developers. Family Search is at an important inflection point; they have all the parts and pieces of a platform, they just have to figure out exactly what changes need to be made to open up, and take things to the next level.

First, let’s quantify the company: what is FamilySearch? “For over 100 years, FamilySearch has been actively gathering, preserving, and sharing genealogical records worldwide”, believing that “learning about our ancestors helps us better understand who we are—creating a family bond, linking the present to the past, and building a bridge to the future”.

FamilySearch is 1.2 billion total records, with 108 million completed in 2014 so far, with 24 million awaiting, as well as 386 active genealogical projects going on. Family Search provides the ability to manage photos, stories, documents, people, and albums—allowing people to be organized into a tree, knowing the branch everyone belongs to in the global family tree.

FamilySearch started out as the Genealogical Society of Utah, which was founded in 1894, and is dedicated to preserving the records of the family of mankind, looking to "help people connect with their ancestors through easy access to historical records”. FamilySearch is a mission-driven, non-profit organization, run by The Church of Jesus Christ of Latter-day Saints. All of this comes together to define an entity that possesses an image that will appeal to some, while leaving concern for others, making for a pretty unique formula for an API driven platform, that doesn’t quite have a model anywhere else.

FamilySearch considers what they deliver as a set of record custodian services:

  • Image Capture - Obtaining a preservation quality image is often the most costly and time-consuming step for records custodians. Microfilm has been the standard, but digital is emerging. Whether you opt to do it yourself or use one of our worldwide camera teams, we can help.
  • Online Indexing - Once an image is digitized, key data needs to be transcribed in order to produce a searchable index that patrons around the world can access. Our online indexing application harnesses volunteers from around the world to quickly and accurately create indexes.
  • Digital Conversion - For those records custodians who already have a substantial collection of microfilm, we can help digitize those images and even provide digital image storage.
  • Online Access - Whether your goal is to make your records freely available to the public or to help supplement your budget needs, we can help you get your records online. To minimize your costs and increase access for your users, we can host your indexes and records on FamilySearch.org, or we can provide tools and expertise that enable you to create your own hosted access.
  • Preservation - Preservation copies of microfilm, microfiche, and digital records from over 100 countries and spanning hundreds of years are safely stored in the Granite Mountain Records Vault—a long-term storage facility designed for preservation.

FamilySearch provides a proven set of services that users can take advantage of via a web application, as well as iPhone and Android mobile apps, resulting in the online community they have built today. FamilySearch also goes beyond their basic web and mobile application services, elevating things to the software as a service (SaaS) level with a pretty robust developer center and API stack.

Developer Center
FamilySearch provides the required first impression when you land in the FamilySearch developer center, quickly explaining what you can do with the API, "FamilySearch offers developers a way to integrate web, desktop, and mobile apps with its collaborative Family Tree and vast digital archive of records”, and immediately provides you with a getting started guide, and other supporting tutorials.

FamilySearch provides access to over 100 API resources in twenty separate groups: Authorities, Change History, Discovery, Discussions, Memories, Notes, Ordinances, Parents and Children, Pedigree, Person, Places, Records, Search and Match, Source Box, Sources, Spouses, User, Utilities, and Vocabularies, connecting you to the core FamilySearch genealogical engine.
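To make the shape of API consumption concrete, here is a hypothetical sketch of building an authenticated request against a person resource. The host, path, person ID, and token handling are illustrative assumptions, not taken from the official FamilySearch documentation.

```python
# Hypothetical sketch: building a request against a FamilySearch-style
# person resource. The host, path, and header values are illustrative
# assumptions, not the documented FamilySearch API surface.
from urllib.request import Request

API_BASE = "https://api.example-familysearch.org"  # assumed host

def person_request(person_id: str, access_token: str) -> Request:
    """Build an authenticated GET request for a single person record."""
    url = f"{API_BASE}/platform/tree/persons/{person_id}"
    return Request(url, headers={
        "Authorization": f"Bearer {access_token}",  # OAuth-style bearer token
        "Accept": "application/json",               # ask for a JSON response
    })

req = person_request("KWQS-BBQ", "example-token")
print(req.full_url)
# urllib.request.urlopen(req) would perform the call in a real integration
```

The point of the sketch is the pattern, not the specifics: each of the twenty resource groups would be reached the same way, with an access token and a resource path.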

The FamilySearch developer area provides all the common, and even some forward leaning technical building blocks:

To support developers, FamilySearch provides a fairly standard support setup:

To augment support efforts there are also some other interesting building blocks:

Setting the stage for FamilySearch evolving into a platform, they even possess some necessary partner level building blocks:

There is even an application gallery showcasing what web, mac & windows desktop, and mobile applications developers have built. FamilySearch even encourages developers to “donate your software skills by participating in community projects and collaborating through the FamilySearch Developer Network”.

Many of the ingredients of a platform exist within the current FamilySearch developer hub, at least the technical elements, along with some of the common business and political building blocks of a platform, but what is missing? This is what makes FamilySearch a compelling story, because it emphasizes one of the core elements of API Evangelist: that all of this API stuff only works when the right blend of technical, business, and politics exists.

Establishing A Rich Partnership Environment

FamilySearch has some strong partnerships that have helped establish FamilySearch as the genealogy service it is today. FamilySearch knows they wouldn't exist without the partnerships they've established, but how do you take it to the next level, and grow into a much larger, organic, API driven ecosystem where a long tail of genealogy businesses, professionals, and enthusiasts can build on, and contribute to, the FamilySearch platform?

FamilySearch knows the time has come to make a shift to being an open platform, but is not entirely sure what needs to happen to actually stimulate not just the core FamilySearch partners, but also establish a vibrant long tail of developers. A developer portal is not just a place where geeky coders come to find what they need, it is a hub where business development occurs at all levels, in both synchronous, and asynchronous ways, in a 24/7 global environment.

FamilySearch acknowledges they have some issues when it comes to investing in API driven partnerships:

  • “Platform” means their core set of large partners
  • Not treating all partners like first class citizens
  • Competing with some of their partners
  • Don’t use their own API, creating a gap in perspective

FamilySearch knows if they can work out the right configuration, they can evolve FamilySearch from a digital genealogical web and mobile service into a genealogical platform. If they do this they can scale beyond what they've been able to do with a core set of partners, and crowdsource the mapping of the global family tree, allowing individuals to map their own family trees while also contributing to the larger global tree. With a proper API driven platform this process doesn't have to occur via the FamilySearch website and mobile app; it can happen in any web, desktop, or mobile application anywhere.

FamilySearch already has a pretty solid development team taking care of the tech of the FamilySearch API, and they have 20 people working internally to support partners. They have a handle on the tech of their API; they just need to get a handle on the business and politics of their API, and invest in the resources needed to help scale the FamilySearch API from being just a developer area, to being a growing genealogical developer community, to a full blown ecosystem that spans not just the FamilySearch developer portal, but thousands of other sites and applications around the globe.

A Good Dose Of API Evangelism To Shift Culture A Bit

A healthy API evangelism strategy brings together a mix of business, marketing, sales, and technology disciplines into a new approach to doing business for FamilySearch. Done right, it can open up FamilySearch to outside ideas, and with the right framework, allow the platform to move beyond just certification and partnering, to also investment in, and acquisition of, data, content, talent, applications, and partners via the FamilySearch developer platform.

Think of evangelism as the grease in the gears of the platform, allowing it to grow, expand, and handle a larger volume of outreach and support. API evangelism works to lubricate all aspects of platform operation.

First, let's kick off by setting some objectives for why we are doing this. What are we trying to accomplish?

  • Increase Number of Records - Increase the number of overall records in the FamilySearch database, contributing to the larger goal of mapping the global family tree.
  • Growth in New Users - Grow the number of new users who are building on the FamilySearch API, increasing the overall headcount for the platform.
  • Growth In Active Apps - Increase not just new users but the number of actual apps being built and used, not just counting people kicking the tires.
  • Growth in Existing User API Usage - Increase how existing users are putting the FamilySearch APIs to use. Educate about new features, increase adoption.
  • Brand Awareness - One of the top reasons for designing, deploying, and managing an active API is to increase awareness of the FamilySearch brand.
  • What else?

What does developer engagement look like for the FamilySearch platform?

  • Active User Engagement - How do we reach out to existing, active users to find out what they need, and how do we profile them to continue to understand who they are and what they need? Is there a direct line to the CRM?
  • Fresh Engagement - How is FamilySearch contacting new developers who have registered each week to learn their immediate needs, while the registration is still fresh in their minds?
  • Historical Engagement - How are historically active and/or inactive developers being engaged, to better understand what their needs are, and what would make them active or increase their activity?
  • Social Engagement - Is FamilySearch profiling the URL, Twitter, Facebook, LinkedIn, and Github profiles of developers, and then actively engaging via these channels?

Establish a Developer Focused Blog For Storytelling

  • Projects - There are over 390 active projects on the FamilySearch platform, plus any number of active web, desktop, and mobile applications. All of this activity should be regularly profiled as part of platform evangelism. An editorial assembly line of technical projects should feed blog stories, how-tos, samples, and Github code libraries, establishing a large volume of exhaust via the FamilySearch platform.
  • Stories - FamilySearch is great at writing public and partner facing content, but there is a need for writing, editing, and posting of stories derived from the technically focused projects, with SEO and API support by design.
  • Syndication - Syndicate the best of the content to Tumblr, Blogger, Medium, and other relevant blogging sites on a regular basis.

Mapping Out The Genealogy Landscape

  • Competition Monitoring - Evaluation of regular activity of competitors via their blog, Twitter, Github and beyond.
  • Alpha Players - Who are the vocal people in the genealogy space with active Twitter, blogs, and Github accounts?
  • Top Apps - What are the top applications in the space, whether built on the FamilySearch platform or not, and what do they do?
  • Social - Mapping the social landscape for genealogy: who is who, and who should the platform be working with?
  • Keywords - Establish a list of keywords to use when searching for topics on search engines, Q&A sites, forums, social bookmarking, and social networks. (This should already be done by marketing folks.)
  • Cities & Regions - Target specific markets in cities that make sense for the evangelism strategy. What are the local tech meetups, organizations, schools, and other gatherings? Who are the tech ambassadors for FamilySearch in these spaces?

Adding To Feedback Loop From Forum Operations

  • Stories - Derive stories for the blog from forum activity, and the actual needs of developers.
  • FAQ Feed - Is this being updated regularly with new material?
  • Streams - Are there other streams giving the platform a heartbeat?

Being Social About Platform Code and Operations With Github

  • Setup Github Account - Set up a FamilySearch platform developer account, and bring the internal development team in under a team umbrella.
  • Github Relationships - Manage followers, forks, downloads, and other potential relationships via Github, which has grown beyond just code, and is social.
  • Github Repositories - Manage code sample Gists, official code libraries, and any samples, starter kits, or other code generated through projects.

Adding To The Feedback Loop From The Bigger FAQ Picture

  • Quora - Regularly monitor Quora and respond to relevant FamilySearch or industry related questions.
  • Stack Exchange - Regularly monitor Stack Exchange / Stack Overflow and respond to relevant FamilySearch or industry related questions.
  • FAQ - Add questions from the bigger FAQ picture to the local FamilySearch FAQ for referencing locally.

Leverage Social Engagement And Bring In Developers Too

  • Facebook - Consider setting up a new API specific Facebook page, posting all API evangelism activities, and managing friends and followers.
  • Google Plus - Consider setting up a new API specific Google+ page, posting all API evangelism activities, and managing followers.
  • LinkedIn - Consider setting up a new API specific LinkedIn profile page that follows developers and other relevant users for engagement, posting all API evangelism activities.
  • Twitter - Consider setting up a new API specific Twitter account, tweeting all API evangelism activity and relevant industry landscape activity, discovering new followers, and engaging with them.

Sharing Bookmarks With the Social Space

  • Hacker News - Social bookmarking of all relevant API evangelism activities as well as relevant industry landscape topics to Hacker News, to keep a fair and balanced profile, as well as network and user engagement.
  • Product Hunt - Product Hunt is a place to share the latest tech creations, providing an excellent format for API providers to share details about their new API offerings.
  • Reddit - Social bookmarking of all relevant API evangelism activities as well as relevant industry landscape topics to Reddit, to keep a fair and balanced profile, as well as network and user engagement.

Communicate Where The Roadmap Is Going

  • Roadmap - Provide regular roadmap feedback based upon developer outreach and feedback.
  • Changelog - Make sure the change log always reflects the roadmap communication or there could be backlash.

Establish A Presence At Events

  • Conferences - What are the top conferences occurring that we can participate in or attend? Pay attention to calls for papers at relevant industry events.
  • Hackathons - What hackathons are coming up in 30, 90, 120 days? Which should be sponsored, attended, etc.?
  • Meetups - What are the best meetups in target cities? Are there different formats that would best meet our goals? Are there any sponsorship or speaking opportunities?
  • Family History Centers - Are there local opportunities for the platform to hold training, workshops and other events at Family History Centers?
  • Learning Centers - Are there local opportunities for the platform to hold training, workshops and other events at Learning Centers?

Measuring All Platform Efforts

  • Activity By Group - Summary and highlights from weekly activity within each area of the API evangelism strategy.
  • New Registrations - Historical and weekly accounting of new developer registrations across APIs.
  • Volume of Calls - Historical and weekly accounting of API calls per API.
  • Number of Apps - How many applications are there?
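A minimal sketch of how the weekly accounting above might be computed, assuming registrations arrive as (ISO date, developer ID) pairs; the data shape is an assumption for illustration, not FamilySearch's actual reporting format.

```python
# Minimal sketch: weekly accounting of new developer registrations.
# Assumes registrations are (iso_date, developer_id) tuples; the data
# shape is illustrative, not an actual FamilySearch reporting format.
from collections import Counter
from datetime import date

def weekly_registrations(registrations):
    """Count new registrations per ISO week, keyed like '2014-W32'."""
    counts = Counter()
    for iso_date, _dev_id in registrations:
        year, week, _day = date.fromisoformat(iso_date).isocalendar()
        counts[f"{year}-W{week:02d}"] += 1
    return dict(counts)

sample = [("2014-08-04", "dev1"), ("2014-08-05", "dev2"), ("2014-08-12", "dev3")]
print(weekly_registrations(sample))  # {'2014-W32': 2, '2014-W33': 1}
```

The same grouping approach extends to API call volume per API, or active apps per week, by swapping what gets counted.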

Essential Internal Evangelism Activities

  • Storytelling - Telling stories about an API isn't just something you do externally. What stories need to be told internally to make sure an API initiative is successful?
  • Conversations - Incite internal conversations about the FamilySearch platform. Hold brown bag lunches if you need to, or internal hackathons to get them involved.
  • Participation - It is very healthy to include people from across the company in API operations. How can we include people from other teams in API evangelism efforts? Bring them to events and conferences, and potentially expose them to local, platform focused events.
  • Reporting - Sometimes providing regular numbers and reports to key players internally can help keep operations running smoothly. What reports can we produce? Make them meaningful.

All of this evangelism starts with a very external focus, which is a hallmark of API and developer evangelism efforts, but if you notice, by the end we bring it home to the most important aspect of platform evangelism: the internal outreach. The number one reason APIs fail is a lack of internal evangelism: failing to educate top and mid-level management, as well as lower level staff, to get buy-in and direct hands-on involvement with the platform, and to justify the budget for the resources needed to make a platform successful.

Top-Down Change At FamilySearch

The change FamilySearch is looking for already has top level management buy-in; the problem is that the vision is not in lockstep with actual platform operations. When projects developed via the FamilySearch platform are regularly showcased to top level executives, and stories consistent with platform operations are told, management will echo what is actually happening via the FamilySearch platform. This will provide a much deeper, ongoing message to the rest of the company, and to partners, about the priorities of the platform, making it not just a meaningless top down mandate.

An example of this in action is the recent mandate from President Obama that all federal agencies should go "machine readable by default", which includes using APIs and open data outputs like JSON, instead of document formats like PDF. This top down mandate makes for a good PR soundbite, but in reality has little effect on the ground at federal agencies. It has taken two years of hard work on the ground, at each agency, between agencies, and with the public, to even begin to make this mandate a reality at over 20 federal agencies.

Top down change is a piece of the overall platform evolution at FamilySearch, but only a piece. Without proper bottom-up and outside-in change, FamilySearch will never evolve beyond just being genealogical software as a service with an interesting API. It takes much more than leadership to make a platform.

Bottom-Up Change At FamilySearch

One of the most influential aspects of APIs I have seen at companies, institutions, and agencies is the change of culture brought about when APIs move beyond just a technical IT effort, and become about making resources available across an organization, enabling people to do their jobs better. Without awareness, buy-in, and in some cases an evangelist conversion, a large organization will not be able to move from a service orientation to a platform way of thinking.

If a company as a whole is unaware of APIs, either within the organization or out in the larger world of popular platforms like Twitter, Instagram, and others, it is extremely unlikely they will endorse, let alone participate in, moving from being a digital service to a platform. Employees need to see the benefits of a platform to their everyday job, and their involvement cannot require what they would perceive as extra work to accomplish platform related duties. FamilySearch employees need to see the benefits the platform brings to the overall mission, and play a role in making it happen, even if it originates from a top-down mandate.

Top bookseller Amazon was already on the path to being a platform with their set of commerce APIs when, after a top down mandate from CEO Jeff Bezos, Amazon internalized APIs in such a way that the entire company interacted and exchanged resources using web APIs, resulting in one of the most successful API platforms: Amazon Web Services (AWS). Bezos mandated that if an Amazon department needed to procure a resource from another department, like server or storage space from IT, it needed to happen via APIs. This wasn't a meaningless top-down mandate; it made employees' lives easier, and ultimately made the entire company more nimble and agile, while also saving time and money. Without buy-in and execution from Amazon employees, what we know as the cloud would never have occurred.

Change at large enterprises, organizations, institutions, and agencies can be expedited with the right top-down leadership, and with the right platform evangelism strategy, one that includes internal stakeholders not just as targets of outreach efforts but as participants in operations, it can result in sweeping, transformational changes. This type of change at a single organization can change how an entire industry operates, similar to what we've seen from the ultimate API platform pioneer, Amazon.

Outside-In Change At FamilySearch

The final layer of change that needs to occur to bring FamilySearch from being just a service to a true platform is opening up the channels to outside influence, not just in platform operations, but in organizational operations as well. The bar is high at FamilySearch. The quality of services, the expectations of the process, and the adherence to the mission are strong, but if you are truly dedicated to providing a database of all mankind, you are going to have to let mankind in a little bit.

FamilySearch is still the keeper of knowledge, but to become a platform you have to let in the possibility that outside ideas, processes, and applications can bring value to the organization, as well as to the wider genealogical community. You have to evolve beyond the notion that the best ideas come only from inside the organization, or just from the leading partners in the space. There are opportunities for innovation and transformation in the long-tail stream, but you have to have a platform set up to encourage, participate in, and identify value in the long-tail stream of an API platform.

Twitter is one of the best examples of how any platform will have to let in outside ideas, applications, companies, and individuals. Much of what we consider Twitter today was built in the platform ecosystem, from the iPhone and Android apps, to the desktop app TweetDeck, to terminology like the #hashtag. Over the last 5 years, Twitter has worked hard to find the optimal platform balance in how they educate, communicate, invest in, acquire, and incentivize their platform ecosystem. Letting in outside ideas goes well beyond the fact that Twitter is a publicly available social platform; with such a large base of API developers, it is impossible to let in all ideas, but through a sophisticated evangelism strategy of in-person and online channels, in 2014 Twitter has managed to find a balance that is working well.

Having a public facing platform doesn't mean the flood gates are open for ideas and thoughts to just flow in; this is where service composition, and the certification and partner framework for FamilySearch, will come in. Through clear, transparent partner tiers, and open and transparent operations and communications, an optimal flow of outside ideas, applications, companies, and individuals can be established, enabling a healthy, sustainable amount of change from the outside world.

Knowing All Of Your Platform Partners

The hallmark of any mature online platform is a well established partner ecosystem. If you've made the transition from service to platform, you've established a pretty robust approach to not just certifying and onboarding your partners, but also knowing and understanding who they are, what their needs are, and investing in them throughout the lifecycle.

First off, profile everyone who comes through the front door of the platform. If they sign up for a public API key, who are they, and where do they potentially fit into your overall strategy? Don't be pushy, but understand who they are and what they might be looking for, and make sure you have a well defined track for this type of user.
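The routing of new signups into well defined tracks could be sketched as follows; the profile fields, volume threshold, and track labels are all hypothetical assumptions for illustration, not FamilySearch's actual CRM model.

```python
# Hypothetical sketch: routing new API-key signups into engagement
# tracks based on simple profile signals. Field names and track labels
# are assumptions for illustration, not an actual FamilySearch model.
def engagement_track(profile: dict) -> str:
    """Pick a follow-up track for a newly registered developer."""
    if profile.get("company") and profile.get("expected_volume", 0) > 100_000:
        return "partner-candidate"   # route toward a certification conversation
    if profile.get("github") or profile.get("blog"):
        return "community-builder"   # visible in the space, worth showcasing
    return "self-serve"              # standard onboarding emails and docs

print(engagement_track({"company": "AcmeGenealogy", "expected_volume": 500_000}))
# partner-candidate
```

The value is less in the specific rules than in having any explicit, documented track for each type of user, so no registrant falls through the cracks.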

Next, qualify and certify as you have been doing. Make sure the process is well documented, but also transparent, allowing companies and individuals to quickly understand what it will take to get certified, what the benefits are, and which other partners have achieved this status. As a developer building a genealogical mobile app, I need to know what I can expect, and have some incentive for investing in the certification process.

Keep your friends close, and your competition closer. Open the door wide for your competition to become platform users, and potentially partners. 100+ year old technology company Johnson Controls (JCI) was concerned about what the competition might do if they opened up their building efficiency data resources to the public via the Panoptix API platform; after it launched, they realized their competition were now their customers, and partners in this new approach to doing business online for JCI.

When the Department of Energy decides what data and other resources it makes available via Data.gov or the agency's developer program, it has to deeply consider how this could affect U.S. industries. The resources the federal agency possesses can be pretty high value, and bring huge benefits to the private sector, but in some cases, how might opening up APIs, or limiting access to APIs, help or hurt the larger economy, as well as the Department of Energy developer ecosystem? There are lots of considerations when opening up API resources, and they vary from industry to industry.

There are no silver bullets when it comes to API design, deployment, management, and evangelism. It takes a lot of hard work, communication, and iterating before you strike the right balance of operations, and every business sector will be different. Without knowing who your platform users are, and being able to establish a clear and transparent road for them to follow to achieve partner status, FamilySearch will never elevate to a true platform. How can you scale the trusted layers of your platform, if your partner framework isn’t well documented, open, transparent, and well executed? It just can’t be done.

Meaningful Monetization For Platform

All of this will take money to make happen. Designing and executing on the technical and evangelism aspects I'm laying out will cost a lot of money, and on the consumer side, it will take money to design, develop, and manage desktop, web, and mobile applications built around the FamilySearch platform. How will both the FamilySearch platform and its participants make ends meet?

This conversation is a hard one for startups and established businesses, let alone a non-profit, mission driven organization. Internal developers cost money; servers and bandwidth are getting cheaper, but are still a significant platform cost; and sustaining sales, bizdev, and evangelism efforts will not be cheap either. It takes money to properly deliver resources via APIs, and even if the lowest tiers of access are free, at some point consumers are going to have to pay for access, resources, and advanced features.

The conversation around how to monetize API driven resources is going on across government, from cities up to the federal government, where the thought of charging for access to public data is often unheard of. These are public assets, and they should be freely available. While this is true, think of the same situation when it comes to physical public assets that are owned by the government, like parks. You can freely enjoy many city, county, and federal parks; there are sometimes small fees for usage, but if you want to actually sell something in a public park, you will need to buy permits, and often share revenue with the managing agency. We have to think critically about how we fund the publishing and refinement of publicly owned digital assets; as with physical assets, there will be much debate in the coming years around what is acceptable, and what is not.

Woven into the tiers of partner access, there should always be provisions for applying costs, covering overhead, and even generating a little revenue to be applied in other ways. With great power comes great responsibility, and along with great access for FamilySearch partners, many will also be required to cover the costs of compute capacity, storage, and the other hard facts of delivering a scalable platform around any valuable digital assets, whether they are privately or publicly held.

Platform monetization doesn't end with covering the costs of platform operation. Consumers of FamilySearch APIs will need assistance in identifying the best ways to cover their own costs as well. Running a successful desktop, web, or mobile application takes discipline, structure, and the ability to manage overhead costs, while also generating some revenue through a clear business model. As a platform, FamilySearch will have to bring some monetization opportunities to the table for consumers, providing guidance as part of the certification process regarding best practices for monetization, and even some direct opportunities for advertising, in-app purchases, and other common approaches to application monetization and sustainment.

Without revenue greasing the gears, no service can achieve platform status. As with all other aspects of platform operations, the conversation around monetization cannot be one-sided, and just about the needs of the platform provider. Proactive steps need to be taken to ensure both the platform provider and its consumers are monetized in the healthiest way possible, bringing as much benefit to the overall platform community as possible.

Open & Transparent Operations & Communications

How does all of this talk of platform and evangelism actually happen? It takes a whole lot of open, transparent communication across the board. Right now the only active part of the platform is the FamilySearch Developer Google Group; beyond that you don't see any activity that is platform specific. There are active Twitter, Facebook, Google+, and mainstream and affiliate focused blogs, but nothing that serves the platform, or contributes to the feedback loop that will be necessary to take the service to the next level.

On a public platform, communications cannot all be private emails, phone calls, or face to face meetings. One of the things that allows an online service to expand into a platform, then scale and grow into a robust, vibrant, and active community, is a stream of public communications, including blogs, forums, social streams, images, and video content. These communication channels cannot all be one way; they need to include forum and social conversations, and showcase platform activity by API consumers.

Platform communication isn't just about getting direct messages answered; it is about public conversation, so everyone shares in the answer, and public storytelling to help guide and lead the platform. Together with support via multiple channels, this establishes a feedback loop that, when done right, will keep growing, expanding, and driving healthy growth. The transparent nature of platform feedback loops is essential to providing everything consumers will need, while also bringing a fresh flow of ideas and insight inside the FamilySearch firewall.

Truly Shifting The FamilySearch Culture

Top-down, bottom-up, outside-in, with a constant flow of oxygen via a vibrant feedback loop, and the nourishing, sanitizing sunlight of platform transparency, where week by week, month by month, some change can occur. It won't all be good; there are plenty of problems that arise in ecosystem operations, but all of this has the potential to slowly shift culture when done right.

One thing that shows me the team over at FamilySearch has what it takes is that when I asked if I could write this up as a story, rather than just a proposal I email to them, they said yes. This is a true test of whether or not an organization might have what it takes. If you are unwilling to be transparent about the problems you currently have, and the work that goes into your strategy, it is unlikely you will have what it takes to establish the amount of transparency required for a platform to be successful.

When internal staff, large external partners, and long tail genealogical app developers and enthusiasts are in sync via a FamilySearch platform driven ecosystem, I think we can consider that a shift to platform status has occurred for FamilySearch. The real question is, how do we get there?

Executing On Evangelism

This is not a definitive proposal for executing on an API evangelism strategy, merely a blueprint for the seed that can be used to start a slow, seismic shift in how FamilySearch engages its API area, in a way that will slowly evolve it into a community, one that includes internal, partner, and public developers. Some day, with the right set of circumstances, FamilySearch could grow into a robust, social, genealogical ecosystem where everyone comes to access, and participate in, the mapping of mankind.

  • Defining Current Platform - Where are we now? In detail.
  • Mapping the Landscape - What does the world of genealogy look like?
  • Identifying Projects - What are the existing projects being developed via the platform?
  • Define an API Evangelist Strategy - Actually fleshing out a detailed strategy.
    • Projects
    • Storytelling
    • Syndication
    • Social
    • Channels
      • External Public
      • External Partner
      • Internal Stakeholder
      • Internal Company-Wide
  • Identify Resources - What resources currently exist? What is needed?
    • Evangelist
    • Content / Storytelling
    • Development
  • Execute - What does execution of an API evangelist strategy look like?
  • Iterate - What does iteration look like for an API evangelism strategy?
    • Weekly
    • Review
    • Repeat

As with many providers, you don’t want this to take 5 years, so how do you take a 3-5 year cycle, and execute in 12-18 months?

  • Invest In Evangelist Resources - It takes a team of evangelists to build a platform
    • External Facing
    • Partner Facing
    • Internal Facing
  • Development Resources - We need to step up the number of resources available for platform integration.
    • Code Samples & SDKs
    • Embeddable Tools
  • Content Resources - A steady stream of content should be flowing out of the platform, and syndicated everywhere.
    • Short Form (Blog)
    • Long Form (White Paper & Case Study)
  • Event Budget - FamilySearch needs to be everywhere, so people know that it exists. It can’t just be online.
    • Meetups
    • Hackathons
    • Conferences

There is nothing easy about this. It takes time and resources, and there are only so many elements you can automate when it comes to API evangelism. For something that is very programmatic, it takes more of the human variable to make the API driven platform algorithm work. With that said, it is possible to scale some aspects, and increase the awareness, presence, and effectiveness of FamilySearch platform efforts, which is really what is currently missing.

As the API Evangelist, I cannot personally execute on every aspect of an API evangelism strategy for FamilySearch, but I can provide essential planning expertise for the overall FamilySearch API strategy, as well as regular check-ins with the team on how things are going, and help plan the roadmap. The two things I bring to the table that are reflected in this proposal are an understanding of where the FamilySearch API effort currently is, and what is missing to help get FamilySearch to the next stage of its platform evolution.

When operating within the corporate or organizational silo, it can be very easy to lose sight of how other organizations and companies are approaching their API strategies, and miss important pieces of how you need to shift your own. This is one of the biggest inhibitors of API efforts at large organizations, and one of the biggest imperatives for companies to invest in their API strategy, and begin the process of breaking operations out of their silo.

What FamilySearch is facing demonstrates that APIs are much more than the technical endpoint that most believe them to be. It takes many other business and political building blocks to truly go from API to platform.


Low Hanging Fruit For API Discovery In The Federal Government

I looked through 77 of the developer areas for federal agencies, reviewing approximately 190 APIs. While the presentation of 95% of the federal government developer portals is crap, it makes me happy that about 120 of the 190 APIs (over 60%) are actually consumable web APIs that didn't make me hold my nose and run out of the API area.

Of the 190, only 13 actually made me happy for one reason or another:

Don't get me wrong, there are other nice implementations in there. I like the simplicity and consistency in APIs coming out of GSA and SBA, but overall federal APIs reflect what I see a lot in the private sector: some developer makes a decent API, but their follow-through and launch severely lack what it takes to make the API successful. People wonder why nobody uses their APIs? hmmmmm....

A little minimalist simplicity in a developer portal, a simple explanation of what an API does, interactive documentation w/ Swagger, code libraries, and terms of service (TOS) would go a looooooooooooong way in making sure these government resources were found, and put to use.

Ok, so where the hell do I start? Let's look through these 123 APIs and see where the real low hanging fruit is for demonstrating the potential of APIs.json, when it comes to API discovery in the federal government.

Let's start again with the White House (http://www.whitehouse.gov/developers):

Only one API made it out of the USDA:

Department of Commerce (http://www.commerce.gov/developer):

  • Census Bureau API - http://www.census.gov/developers/ - Yes, a real developer area with supporting building blocks (Updates, News, App Gallery, Forum, Mailing List). Really could use interactive documentation though. There are URLs, but no active calls. Would be way easier if you could play with the data before committing. (B)
  • Severe Weather Data Inventory - http://www.ncdc.noaa.gov/swdiws/ - Fairly basic interface, wouldn’t take much to turn into a modern web API. Right now it's just a text file, with spec style documentation explaining what to do. Looks high value. (B)
  • National Climatic Data Center Climate Data Online Web Services - http://www.ncdc.noaa.gov/cdo-web/webservices/v2 - Oh yeah, now we are talking. That is an API. No interactive docs, but nice clean ones. It would be some work, but could be done. (A)
  • Environmental Research Division's Data Access Program - http://coastwatch.pfeg.noaa.gov/erddap/rest.html - Looks like a decent web API. Wouldn’t be too much to generate a machine readable definition and make into a better API area. (B)
  • Space Physics Interactive Data Resource Web Services - http://spidr.ngdc.noaa.gov/spidr/docs/SPIDR.REST.WSGuide.en.pdf - Well, it's a PDF, but looks like a decent web API. It would be some work, but could turn into a decent API with Swagger specs. (B)
  • Center for Operational Oceanographic Products and Services - http://tidesandcurrents.noaa.gov/api/ - Fairly straightforward, simple API. Wouldn’t be hard to generate interactive docs for it. Spec needed. (B)

Arlington Cemetery:

Department of Education:

  • Department of Education - http://www.ed.gov/developers - Lots of high value datasets. Says API, but is a JSON file. Wouldn’t be hard to generate APIs for it all, and make machine readable definitions. (B)

Energy:

  • Energy Information Administration - http://www.eia.gov/developer/ - Nice web API, simple clean presentation. Needs interactive docs. (B)
  • National Renewable Energy Laboratory - http://developer.nrel.gov/ - Close to a modern Developer area with web APIs. Uses standardized access (umbrella). Some of them have Swagger specs, the rest would be easy to create. (A)
  • Office of Scientific and Technical Information - http://www.osti.gov/XMLServices - Interfaces are pretty well designed, and Swagger specs would be straightforward. But docs are all PDF currently. (B)

Department of Health and Human Services (http://www.hhs.gov/developer):

Food and Drug Administration (http://open.fda.gov):

Department of Homeland Security (http://www.dhs.gov/developer):

Two loose cannons:

Department of Interior (http://www.doi.gov/developer):

Department of Justice (http://www.justice.gov/developer):

Labor:

  • Department of Labor - http://developer.dol.gov/ - I love their developer area. They have a great API, easy to generate API definitions. (A)
  • Bureau of Labor Statistics - http://www.bls.gov/developers/ - Web APIs in there. Complex, and lots of work, but can be done. API Definitions Needed. (B)

Department of State (http://www.state.gov/developer):

Department of Transportation (http://www.dot.gov/developer):

Department of the Treasury (http://www.treasury.gov/developer):

Veterans Affairs (http://www.va.gov/developer):

Consumer Financial Protection Bureau:

Federal Communications Commission (http://www.fcc.gov/developers):

Lone bank:

  • Federal Reserve Bank of St. Louis - http://api.stlouisfed.org/ - Good API and area, would be easy to generate API definitions. (B)

General Services Administration (http://www.gsa.gov/developers/):

National Aeronautics and Space Administration (http://open.nasa.gov/developer):

Couple more loose cannons:

Recovery Accountability and Transparency Board (http://www.recovery.gov/arra/FAQ/Developer/Pages/default.aspx):

Small Business Administration (http://www.sba.gov/about-sba/sba_performance/sba_data_store/web_service_api):

Last but not least.

That is a lot of potentially valuable API resources to consume. From my perspective, I think that what has come out of GSA, SBA, and the White House Petition API represent probably the simplest, most consistent, high value targets for me. Next, maybe the wealth of APIs out of Interior and FDA. After that I'll cherry pick from the list, and see which are easiest.

I'm looking to create a Swagger definition for each of these APIs, and publish them as a Github repository, allowing people to play with the APIs. If I have to, I'll create a proxy for each one, because CORS is not common across the federal government. I'm hoping not to spend too much time on proxies, because once I get in there I always want to improve the interface, and evolve a facade for each API, and I don't have that much time on my hands.


Applying APIs.json To API Discovery In The Federal Government

I recently updated my APIs.json files for all my API Evangelist network domains, to use version 0.14, which is getting pretty close to a stable version. While I await APIs.io being updated to use this version, I wanted to spend some time publishing APIs.json files, but this time across federal government APIs.

The thing I like most about APIs.json is that you can create one for anybody else’s APIs. In the case of our federal government, I don't anticipate any agency getting on board with APIs.json anytime soon, but I can do it for them! There are a lot of APIs in the federal government, so where do I get started?

To help me understand the scope of API discovery in our federal government, I looked through 77 developer portals outlined by 18F. While browsing these developer portals for federal government agencies, I looked at almost 190 APIs--with a goal of identifying the low hanging fruit when it comes to API discovery across hundreds of government APIs.

Out of the 190 APIs, around 120 of them were actual web APIs that I felt I could work with. I settled on a handful of APIs out of the GSA, hosted at www.usa.gov and explore.data.gov, and got to work creating APIs.json files for their APIs.

Before I could generate an APIs.json at each of the two domains (www.usa.gov and explore.data.gov), I needed machine readable API definitions for the four APIs. I purposely picked federal agency APIs that were REST(ful), and were something I could easily generate a Swagger definition for.

The federal agency domain API at explore.data.gov was pretty easy, only taking me a few minutes to handcraft a Swagger definition. Then I moved on to the Federal Agency Directory API at www.usa.gov, and I was happy to see there was already a Swagger definition for the API. After that I tackled the Social Media Registry API, and the Mobile App Gallery API, both of which I had to handcraft Swagger definitions for. The Mobile App Gallery API has a CORS issue, but I'm moving on and will set up a proxy to handle it later.
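To give a sense of what handcrafting one of these involves, here is a minimal Swagger-style sketch for a hypothetical directory endpoint. The host, paths, and fields here are illustrative assumptions, not the actual GSA definition:

```json
{
  "swagger": "2.0",
  "info": {
    "title": "Federal Agency Directory API (illustrative)",
    "version": "1.0.0"
  },
  "host": "www.usa.gov",
  "basePath": "/api",
  "paths": {
    "/agencies": {
      "get": {
        "summary": "List federal agencies in the directory",
        "produces": ["application/json"],
        "responses": {
          "200": { "description": "An array of agency records" }
        }
      }
    }
  }
}
```

Even a skeleton like this is enough for tooling to render interactive documentation, which is why a few minutes of handcrafting can go such a long way.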

Now that I had four machine readable API definitions for some pretty valuable government APIs, I got to work creating the APIs.json files that would act as a directory for these API resources. APIs.json files are designed to go into the root domain of any API provider, and the four GSA APIs I selected ran under two separate domains (www.usa.gov and explore.data.gov), so I needed two separate APIs.json files.
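As a rough sketch, the APIs.json for the www.usa.gov domain would look something like the following. The field names reflect my reading of the draft specification, and the URLs are placeholders rather than the published locations:

```json
{
  "name": "USA.gov",
  "description": "APIs operating under the www.usa.gov domain",
  "url": "http://www.usa.gov/apis.json",
  "tags": ["federal", "government", "GSA"],
  "apis": [
    {
      "name": "Federal Agency Directory API",
      "description": "Directory of federal agencies and their contact information",
      "humanURL": "http://www.usa.gov/api",
      "baseURL": "http://www.usa.gov/api",
      "urls": [
        { "type": "Swagger", "url": "http://www.usa.gov/api/swagger.json" }
      ]
    }
  ]
}
```

The explore.data.gov file follows the same pattern, just listing its own APIs.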

When APIs.io gets updated to the latest version of APIs.json, I will submit both of these APIs.json files for indexing. Even though I’m not the owner of these domains, I can still submit them for inclusion in the API search engine. It would be better if the GSA published the APIs.json and Swagger API definitions at the actual domains, and submitted themselves--the listings would then show as being authoritative, and hold more weight in API searches.

I still have about 115 more federal government APIs to create machine readable API definitions for, and the resulting APIs.json files that will enable discovery of these APIs--this isn’t something that will happen overnight, and will take a shit-ton of work.

My goal is to help harden the APIs.json format, while making APIs in federal government more accessible, and part of the larger API discovery conversation that is going on in the private sector. One of the powerful features of APIs.json, is that external actors can craft APIs.json collections, from the outside. You don’t have to be an owner of a domain, where an API is hosted, to publish an APIs.json on its behalf--I think this represents the potential of the public sector and private sector working together. 

APIs.json files are meant to work as collections of APIs, or virtual stacks of valuable API resources. Sometimes these virtual stacks are defined internally, within a domain, and sometimes they are mashups of multiple API resources, across numerous domains, by outside actors or API curators. I will keep curating federal government APIs, generating machine readable API definitions, and APIs.json files for their supporting domains--helping define this fun new world of API discovery.


The Power In API Discovery For APIs.json Will Be In The API URL Type

An APIs.json file lives in the root of any domain, or subdomain, and provides references to a collection of API resources. APIs.json is meant to be a lightweight framework, where someone can build a collection of APIs, give it a name, description, and some tags, and the API collection points you where you need to go to get more information about those APIs.

For each API, you can define a list of URLs, each with a defining “type”, letting you know what to expect when you visit the URL. Right now, most of those URLs are just for humans, pointing to the developer portal, documentation, and terms of service (TOS). We are adding other API URL types to the next version of APIs.json, like code samples and application galleries, that API search engines like APIs.io can expose in their search interfaces.

These human API URL types provide references that API search engines can use to guide human users who are searching for APIs. However, where the real power of APIs.json comes in, is when an API URL type references a machine readable source, like a Swagger definition, or an API Commons manifest. When it comes to API discovery, we need as many meaningful locations as we can point human API consumers to, but also machine readable locations that will help make API discovery much more rich, automated, and precise.

Imagine when I can do more than just search names, descriptions, and tags by keyword, much like APIs.io works currently. Imagine when you can specify that you only want APIs that are in the API Commons, and openly licensed. Imagine when I can search for APIs that allow me to use all my HTTP verbs, not just GET. Now, go even further into the future, when I can search for APIs that have a specific allowance in their terms of service, with a machine readable TOS type for APIs.json.
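A minimal sketch of what that richer search could look like, assuming a parsed APIs.json collection that uses a list of typed URLs per API. The collection below is invented for illustration, and the type labels are just examples:

```python
def find_apis_with_type(collection, url_type):
    """Return the names of APIs in an APIs.json collection that declare a given URL type."""
    matches = []
    for api in collection.get("apis", []):
        # Gather the set of URL types this API declares
        types = {u.get("type") for u in api.get("urls", [])}
        if url_type in types:
            matches.append(api["name"])
    return matches

# A hypothetical APIs.json collection, already parsed from JSON
collection = {
    "name": "Example Collection",
    "apis": [
        {"name": "Open API", "urls": [
            {"type": "Swagger", "url": "http://example.gov/swagger.json"},
            {"type": "API Commons", "url": "http://example.gov/api-commons.json"}
        ]},
        {"name": "Docs Only API", "urls": [
            {"type": "Documentation", "url": "http://example.gov/docs"}
        ]}
    ]
}

# Only the first API declares an API Commons manifest, so a search for
# openly licensed APIs would surface just that one.
print(find_apis_with_type(collection, "API Commons"))  # ['Open API']
```

An API search engine doing this across thousands of indexed APIs.json files is the kind of filtering that keyword search alone cannot deliver.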

This is where I want to take APIs.json, and ultimately API discovery. Machine readable API definitions like API Blueprint, RAML, and Swagger are the very first API URL types that help automate API discovery, and the API Commons manifest is the latest. My goal is to tackle terms of service, pricing, and other critical aspects of API integration, and push forward a machine readable definition that can ultimately be baked into the API discovery process--in all API search engines.

What is important to me is that the API URL types, in each APIs.json, remain independent, and only loosely coupled with the APIs.json format, using a simple label and URL. I’m not directly invested in the evolution of each Swagger version; however, I am with API Commons, and potentially other future APIs.json API URL type definitions. I want anyone to be able to step up and suggest API URL types for the APIs.json spec, making APIs.json URL types a community driven API discovery layer for the API economy.


Expanding The Layer Of API Discovery From Within The Developer's IDE

Much like API design and integration, the world of API discovery is heating up in 2014. We are moving beyond the API directory as our primary mode of API search, in favor of a distributed approach using APIs.json, and supporting open source search engines like APIs.io. Another area of API discovery I’ve been watching for a while, and predict will become an important layer of API discovery, will be via the Integrated Development Environment (IDE) plugin.

Open Source SalesForce API IDE Plugin
SalesForce just announced they have open sourced their API IDE plugin on Github, after developing it since 2007, when APEX was born. The plugin is old, but very much in use in the SalesForce ecosystem, something I’ve written about before. They will be accepting pull requests on the main branch, looking to improve the codebase, while also maintaining a community branch, and encouraging you to establish your own branch.

Does Your API Have An IDE Plugin?
How far along are you on your own API's Eclipse plugin? Are you trying to reach enterprise developers with your API resources? You should probably look at the pros and cons of providing your API developers with a plugin for leading IDEs. With the open sourcing of the SalesForce API IDE plugin, you can reverse engineer their approach and see what you can use for your own API's IDE plugin—smells like a good opportunity to me.

Opportunity For General Or Niche API IDE Plugins
Not that the SalesForce open source IDE would be the place to start for this kind of project, but I think there is a huge opportunity to develop API focused IDE plugins for top development environments, across many popular APIs. Developers shouldn’t have to leave their development environments to find the resources they need. They should be able to have quick access to the APIs they depend on the most, and discover new API resources right from their local environment, making IDE plugins an excellent API discovery opportunity.

Native Opportunities For IDE Platforms
I’ve seen a lot of new development environments emerge, many of them web-based, with varying degrees of being “integrated”. I think IDE developers can take a lead from Backend as a Service (BaaS) providers and build in the ability to define an integrated stack of API resources, right into a developer's web, mobile, or IoT development environment. If you are building a platform for developers to produce code, you should begin baking API discovery and integration directly into your environment.

All I do as the API Evangelist is shed light on what API pioneers like SalesForce are up to, and expand on their ideas, using my knowledge of the industry--resulting in these stories. SalesForce has been doing APIs for 14 years now, and the IDE has been part of their API driven ecosystem for the last seven years. I think their move to open source the technology is an opportunity for the wider API space to run with, by helping improve the community SalesForce API IDE plugin, but also applying their experience, and legacy code, to help evolve and improve on this layer of API discovery, available within the IDE.


Contributing To The Discovery Lifecycle

One of the newest incentives for API providers to develop machine readable API definitions is API discovery. After many years of no search solution for APIs, beyond the standard API directory, we are just now beginning to see a new generation of API discovery tooling and services.

Together with 3Scale, I recently launched the APIs.json discovery format, looking to create a single API discovery framework that can live within the root of any domain. Our goal is to allow API providers to describe their APIs, providing machine readable pointers to the common building blocks of a company's API like signup, machine readable definitions, documentation, terms of service, and much, much more.

As we were developing APIs.json, we recognized that without a proper, distributed search engine, any machine readable API discovery formats would not be successful, and with this in mind 3Scale launched an open source API search engine, with the first implementation being APIs.io. As the number of APIs rapidly grows, more search solutions like APIs.io will be needed to make sure the most valuable APIs are discoverable, in real-time.

The future of API discovery will need more than just basic metadata to find APIs, we will need machine readable definitions that describe the API, as well as its supporting building blocks. API definitions will help automate the discovery and understanding of what value an API delivers, helping API consumers find just the API resources they need to make their applications successful.


Multiple Types of APIs.json For Discovery

I’m working through thoughts around a suggestion for future versions of the APIs.json API discovery format, and as I do with other things I’m trying to make sense of, I wanted to write a blog post on API Evangelist. If you aren't familiar with the format, APIs.json is meant to be a machine readable JSON file that provides an overview and listing of the APIs available within a specific domain.

Authoritative APIs.json
This is an APIs.json that is made available in the root of a domain, providing detail on APIs that are managed within the same domain. This use case is for API providers to list the APIs that they offer publicly.

Tribute APIs.json
There is an API you use, and want to see indexed in an API search engine like APIs.io—so you create a tribute APIs.json. This APIs.json index is not created by the owner of the API, but by a fan, or outside contributor. Tributes will weave together the world of APIs when providers do not have the time.

Facade APIs.json
There is an API you use, but it doesn’t have exactly the interface you would want. Some resourceful API architects will be building facades on top of existing, external API resources. In the API economy you do not settle if an API isn't exactly what you need. The remixable nature of APIs allows for extending them by designing facades that transform and extend existing API resources.

Cache APIs.json
If I learned anything working in the federal government last year, it is that APIs can go away at any point. In the future there will be a need for cached versions of existing APIs, providing redundancy and access to some important data and content.

Aggregate APIs.json
In fast growing areas of the API economy, we are seeing API aggregation trends, with sectors like social, media, cloud, financial, analytics, and other areas that have matured, and users are depending on potentially multiple API platforms.

Derived APIs.json
I envision derived APIs as just an evolution of the tribute or facade, stating that I started with a certain API design, but have evolved it beyond what it once was. Not acknowledging where we got our API patterns is a darker side of the API space that needs to go away—let’s be honest about where we learned our patterns and give a nod to these sources.

In the API economy, I think there will be multiple types of APIs.json deployed. As APIs proliferate, if the industry focuses on interoperability and reuse, there will be more types of listings than the single authoritative API provider, and APIs.json will help us keep tabs on this.

Not all API providers will have the time, desire, and access to resources to publish their own APIs.json, and the augmentation, replication, and other possible derivatives described above will emerge to organically expand on existing patterns.

Right now we are working on just stabilizing the latest release of APIs.json, so that people can get to work publishing their own APIs.json files. My goal with these thoughts is just to explore what is possible, and maybe, if successful, some of these thoughts will be incorporated into future versions.


Solving The Problem Of API Discovery

API discovery has not changed much since 2005, when John Musser launched ProgrammableWeb, the API directory we've all come to know and love. In 2014 (9 years later), we have Mashape and a handful of other API directory and discovery tools, but we have not made progress on truly being able to discover the best APIs possible, in a distributed, machine-readable way.

Steve Willmott of 3Scale, and Kin Lane of API Evangelist, are at it again, looking to provide a possible solution, that we are calling APIs.json—a machine readable listing of your APIs and supporting building blocks that lives in the root of your domain.

The objective of APIs.json is to help fix this problem by making it easy for people to signpost where the APIs on a given domain are and provide information on how they work. The format is simple and extensible and can be put into any web root domain for discovery.

We are just getting started, so make sure to get involved via the Github repository, or via the Google Group we've set up to facilitate discussion.


If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a Github issue. Even though I do this full time, I'm still a one person show; I miss quite a bit, and depend on my network to help me know what is going on.