blace.ai - closing the gap between research and production
C++ library for local machine learning inference
Our library makes it easy to run your AI models locally on your users' machines. We support Windows, Linux, and macOS, and provide a set of tools to port your existing models into our framework.
Bringing your AI research into production can be a cumbersome task. Our library takes care of all deployment-related work: model preparation and continuous checking, backend optimization, versioning, and testing. Adding new models to our framework is easy. Additionally, you get access to already optimized models in our database.
The Missing Link Between Research and Production
Node-Based Execution Graphs
We provide you with a comprehensive set of C++ tools to build highly optimized, cacheable computation graphs for all AI-related tasks. You don't have to worry about memory management, hardware differences, or other low-level details; you can focus on your product.