The VDK provides tools for data modeling, data object code generation, access to NoSQL data repositories, and data analysis modules for machine learning, natural language processing, and other techniques.
With the VDK, your team can build, test, and deploy Intelligent Applications, with processes in place to iterate on and evolve those applications.
GUI and command-line tools are available to assist with data modeling and code deployment.
When it comes time to scale your application across additional server resources, push your code to the Vital AI Application Platform modules for deployment.
Many code examples can be found in our GitHub repository.
Vital AI provides a low-level Core data model which gives different data processing components a common data framework. This Core data model is extended to include objects for common use cases, such as "User", "Document", and "Event". This Vital Domain Model is in turn extended to include the objects a particular application requires, which becomes the Application Domain Model. For example, if an application will recommend movies to users, it may extend the domain model to include objects for "Film", "Actor", and "Genre".

VitalSigns provides development tools to define a data model across the entire application and generate objects, which are then used in the application's components, including its User Interface, Vital Flows, and Spark/Hadoop jobs. This means the definition of the "User" object is the same in the User Interface, in Vital Flows for recommendations, and in Spark/Hadoop machine learning jobs. This speeds development and eliminates many data incompatibility problems. VitalSigns handles data mapping across components, programming languages, and data repositories.
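As a rough illustration of this layering, the sketch below models Core, Vital Domain, and Application Domain objects as a Python class hierarchy. The class and property names here are hypothetical stand-ins, not the actual VitalSigns-generated classes.

```python
# Hypothetical sketch of the Core -> Vital Domain -> Application
# Domain layering; not the actual VitalSigns-generated classes.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CoreNode:
    """Core data model: properties shared by every object."""
    uri: str

@dataclass
class User(CoreNode):
    """Vital Domain Model: a common, reusable object type."""
    name: str = ""

@dataclass
class Film(CoreNode):
    """Application Domain Model: movie-recommendation extension."""
    title: str = ""
    genres: List[str] = field(default_factory=list)

# Because every component shares one definition of Film, an object
# produced in one component (e.g. a Spark job) is unambiguous in
# another (e.g. the User Interface).
film = Film(uri="urn:film:1", title="Metropolis", genres=["Sci-Fi"])
print(film.uri, film.title)
```

In the real toolchain these classes would be generated from the domain model definition rather than written by hand.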
With multiple data analysis modules, Vital AI enables various types of Artificial Intelligence.
With machine learning, you can categorize data or make numerical predictions.
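To make "categorize data" concrete, here is a minimal nearest-centroid classifier written from scratch; it only illustrates the general technique, not Vital AI's machine learning modules. The labels and feature values are invented for the example.

```python
# Minimal nearest-centroid classification sketch (generic technique;
# not the VDK's machine learning API). Features are hypothetical.
from math import dist
from statistics import mean

def fit_centroids(samples):
    """samples: {label: [feature vectors]} -> {label: centroid}."""
    return {label: tuple(map(mean, zip(*vecs)))
            for label, vecs in samples.items()}

def classify(centroids, x):
    """Assign x to the label of the nearest centroid."""
    return min(centroids, key=lambda label: dist(centroids[label], x))

training = {
    "short_film": [(5.0, 1.0), (7.0, 2.0)],
    "feature":    [(12.0, 4.0), (15.0, 3.5)],
}
centroids = fit_centroids(training)
print(classify(centroids, (6.0, 1.5)))   # -> short_film
```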
With Natural Language Processing, you can categorize text and extract entities (names of people, places, organizations, and things), sentiment, and relationships.
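A toy sketch of two of these tasks follows: lexicon-based sentiment scoring and a naive capitalization heuristic for spotting entity names. Production NLP modules use trained statistical models; this only illustrates the kind of output such a pipeline produces. The word lists are invented for the example.

```python
# Toy NLP sketch: lexicon sentiment + capitalization-based entity
# spotting. Illustrative only; real NLP modules use trained models.
import re

POSITIVE = {"great", "excellent", "love"}
NEGATIVE = {"poor", "terrible", "hate"}

def sentiment(text):
    """Score text by counting positive vs. negative lexicon hits."""
    words = re.findall(r"[a-z']+", text.lower())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def entities(text):
    """Naive heuristic: runs of two or more capitalized words."""
    return re.findall(r"\b[A-Z][a-z]+(?: [A-Z][a-z]+)+", text)

text = "I love the films of Fritz Lang."
print(sentiment(text), entities(text))
```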
With graph analysis, including social network analysis, you can determine important items and people in networks.
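One simple measure of "importance" in a network is degree centrality: how many connections each node has. The sketch below computes it from scratch over an invented friendship graph; social network analysis toolkits offer this and richer measures (betweenness, PageRank, and so on).

```python
# Degree-centrality sketch over a hypothetical friendship graph;
# a stand-in for the richer measures graph analysis tools provide.
from collections import Counter
from itertools import chain

edges = [("alice", "bob"), ("alice", "carol"),
         ("alice", "dave"), ("bob", "carol")]

# Count how many edges touch each node.
degree = Counter(chain.from_iterable(edges))
most_connected, _ = degree.most_common(1)[0]
print(most_connected)   # alice has the most connections
```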
Using logical inference, you can use rules and infer new insights from your data.
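As a sketch of rule-based inference, the following minimal forward-chaining engine applies "if all premises hold, conclude X" rules until no new facts appear. The facts and rules are hypothetical, loosely themed on the movie-recommendation example above.

```python
# Minimal forward-chaining inference sketch; facts and rules are
# hypothetical, not from Vital AI's inference modules.
def infer(facts, rules):
    """facts: set of strings; rules: [(premises, conclusion)]."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    # Anything with a director is a film.
    (["directed(lang, metropolis)"], "film(metropolis)"),
    # Recommend films to users who like films.
    (["film(metropolis)", "likes(user1, films)"],
     "recommend(user1, metropolis)"),
]
facts = infer({"directed(lang, metropolis)", "likes(user1, films)"}, rules)
print("recommend(user1, metropolis)" in facts)   # True
```

Note how the second rule only fires after the first has added `film(metropolis)`: chaining rules like this is how inference derives insights that were not stated explicitly in the data.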
See our presentation from the November 2015 Enterprise Dataversity for information about optimizing the data supply chain.
See our presentation from the August 2015 NoSQL Now! conference for information about MetaQL.
See our presentation from the August 2014 Semantic Technology Conference for information about data models with Big Data.
See our presentation from the October 2013 NYC Semantic Technology Conference for information about developing Intelligent Apps.