  • Pushing the Needle Forward for AI Deployments

    At Datmo, we spend a lot of time working to build a better product for customers deploying AI to production, and part of that is ensuring we offer the most seamless developer experience possible. While each customer has a unique set of constraints around their model type and production needs, one thing is consistent for everyone: people want to spend less time on deployments and more time working on their models and experiments.
  • Deploying a machine learning model as an API with Datmo, Falcon, Gunicorn, and Python

    Learn how Datmo helps you wrangle the zoo of programming animals to create an easy-to-deploy machine learning API with an adaptable architecture. What’s an API? APIs (Application Programming Interfaces) are software communication methods built to a particular standard. Many companies have their own public API products that solve specific problems for developers: by passing the API a parameter as input, the user receives an output without needing to know (or understand) how the underlying task is done.
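    For readers curious what that pattern looks like in practice, here is a minimal, hypothetical sketch (not code from the post itself) of a Falcon resource that accepts an input parameter and returns a model’s output, served by Gunicorn; the route, resource name, and prediction values are illustrative assumptions.

      # Minimal, hypothetical sketch of serving a model prediction behind a Falcon route.
      # Gunicorn can serve the `api` WSGI object defined at the bottom.
      import falcon

      class PredictResource:
          def on_post(self, req, resp):
              # Read the caller's input parameters from the JSON request body.
              params = req.media
              # Placeholder for a real model call, e.g. model.predict(params["features"]).
              resp.media = {"label": "example", "confidence": 0.97}
              resp.status = falcon.HTTP_200

      # WSGI callable for Gunicorn, e.g.: gunicorn app:api
      api = falcon.App()  # falcon.API() in older Falcon releases
      api.add_route("/predict", PredictResource())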
  • Demystifying the ML, AI, and Data Science development ecosystem (Part 1: Build)

    This blog post is part 1 of a three-part series explaining the landscape of what we call the quantitative oriented developer (QoD) pipeline, which encompasses machine learning, artificial intelligence, and data science. There are already many tutorials for getting started with individual modeling frameworks or algorithms, but explanations of how all of the moving parts fit together in these broader workflows are lacking.
  • Announcing the Datmo Applied AI/Machine Learning Fellowship

    Applications are now open for the first class of our new 6-week, project-oriented mentorship program. The Datmo Applied AI/ML Fellowship is a project-based mentorship program designed to help aspiring AI/ML engineers hone their data modeling and technical communication skills through a project of their choosing, while receiving guidance and mentorship from industry experts.
  • Learning Data Science? You may already know the basics

    There are countless people trying to get into data science lately, but many are intimidated by the idea of learning a new workflow. In the last year alone, the number of people reading about and interacting with machine learning worldwide has jumped from roughly 100k in 2016 to over 10 million in 2017.
  • An Overview of Deploying Quantitative Workflows to Production

    Building and deploying machine learning models for enterprise use cases can be time-consuming, tricky, and even political within an organization. The technology is exciting: predictive intelligence and classification, for example, have great disruptive potential and enable organizations to build competitive advantages with their own data.
  • Collaboration in Quantitative Workflows

    The other day two of our engineers, we’ll call them Alex and Ignacio, were trying to improve the accuracy of a facial recognition model. The main goal: get the model to recognize Donald Trump in images, videos, and gifs with as high a confidence as possible. The sub-goal: Alex wanted Ignacio’s help in improving the model for practical use cases, specifically its performance on a set of 10,000 newly labeled images they received from a customer.
  • Tracking and Reproducibility in Quantitative Workflows

    Documenting your work is necessary but boring, regardless of the type of work you do. While tracking and reproducing work for most generic web-connected applications and workflows is becoming more standardized (e.g., document state-saving and tracking through Google Docs, and code version control systems like GitHub), there is currently no widely accepted standard or simple automation for data science and machine learning.
  • Quantitative Workflows: A New Paradigm for Everyone

    Topics like machine learning, artificial intelligence, and data science have been talked about at length over the last few years. But these topics have been around for ages, albeit under names that have changed over time. The types of problems that developers solve in these fields all fit into what we call Quantitative Workflows: the process of starting with data and deriving insights, actions, and quantitative models.