40° - the Laboratory for Innovation - offers knowledge, competencies, tools and platform services that turn customer data and open data into new insights for developing your organization, your business model and your markets.
With our Predictive Data Science approach, we use machine learning and deep learning to learn from well-founded data analyses and to derive guidance for future decisions. We rely on free tools that run in the cloud as well as in our customers' data centers. To put data and analyses to effective use and to ensure a thorough understanding of the data, we develop customized dashboards and management systems, or expose the insights in real time via standard interfaces (RESTful APIs).
We are convinced that Artificial Intelligence is one of mankind's most useful inventions, and with our offering we make this technology available to medium-sized companies.
Our work is practice- and results-oriented. We approach problem solving iteratively and are always on the lookout for new insights and opportunities. Where necessary, we work according to proven project and process management standards such as Scrum (PSM I), PRINCE2® or ITIL® 4.
Proven programming languages for data science, machine learning, deep learning and data lake web services.
Extensive and powerful open source libraries and frameworks for data analysis in Python or R.
A tool for cluster computing and high-performance data analytics that puts data at the center of the action.
Based on TensorFlow with integrated Keras, we implement models for machine learning and deep learning.
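As an illustration of the kind of model this stack supports, a minimal Keras binary classifier might look like this (the layer sizes and the three-feature input are placeholder assumptions, not a customer model):

```python
import tensorflow as tf

# Minimal binary classifier: three input features,
# one small hidden layer, one sigmoid output unit.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Standard setup for a binary classification task.
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

From here, `model.fit(X, y)` trains on prepared feature data; the same model definition runs unchanged in the cloud or in the data center.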
We use Apple's Core ML / Create ML infrastructure for edge computing and machine learning directly on the end device.
Interactive data visualization that promotes a visual understanding of data through in-depth analysis.
A framework and data analysis tools providing powerful data structures in Python.
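The central data structure here is the pandas DataFrame; a small sketch of how tabular data is loaded and aggregated (the region/revenue columns are invented for illustration):

```python
import pandas as pd

# Invented example data: revenue records per region.
df = pd.DataFrame({
    "region": ["North", "South", "North", "South"],
    "revenue": [120.0, 80.0, 150.0, 95.0],
})

# Aggregate total revenue per region.
totals = df.groupby("region")["revenue"].sum()
```

The same pattern scales from a handful of rows to files and database tables read via `pd.read_csv` or `pd.read_sql`.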
RESTful API microservices for high-performance business services and data lakes in the backend. On premise or in the cloud.
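Such a microservice endpoint can be sketched with Flask (the route and JSON payload are illustrative assumptions, not our production API):

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/v1/health")
def health():
    # A simple health-check endpoint returning JSON,
    # as consumed by dashboards or monitoring systems.
    return jsonify(status="ok")

# In production this would run behind a WSGI server;
# for local testing: app.run(port=8080)
```

Clients consume such endpoints over plain HTTP, which keeps the backend deployable on premise or in the cloud without changes.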
Scalable framework for the distributed processing of large amounts of data across clusters of computers.