IBM Research unveiled CodeFlare, a new framework for uniting and scaling AI (artificial intelligence) workflows and big data in a hybrid cloud environment. The open-source framework aims to help developers reduce the time spent creating pipelines to train and enhance machine learning models.
CodeFlare is built on Ray, an open-source distributed computing framework that emerged from UC Berkeley, and extends it with a set of elements that make it easier to scale workflows. Through a Python-based interface for pipelines, CodeFlare makes it easy to integrate, parallelize, and share data, helping unify pipeline workflows across multiple platforms without requiring data scientists to learn a new workflow language.
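The article does not show CodeFlare's own pipeline API, but a minimal sketch using plain Ray, the framework CodeFlare builds on, illustrates the kind of Python-native parallelism its pipeline abstraction wraps. The function and data here are placeholders for a real preprocessing step.

```python
# Minimal Ray sketch (not the CodeFlare pipeline API itself): fan a Python
# function out over data shards in parallel, the pattern CodeFlare scales.
import ray

ray.init()  # start a local Ray runtime; on a cluster this attaches to it

@ray.remote
def preprocess(shard):
    # Placeholder transform; a real pipeline step would clean or featurize data.
    return [x * 2 for x in shard]

shards = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
# Each shard is processed as an independent Ray task, running in parallel.
futures = [preprocess.remote(s) for s in shards]
results = ray.get(futures)
print(results)
```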
While IBM says CodeFlare pipelines run efficiently on its new serverless platform IBM Cloud Code Engine and on Red Hat OpenShift, developers can deploy them anywhere. CodeFlare helps developers integrate and bridge pipelines with other cloud-native ecosystems by providing adapters for event triggers, and it can load and partition data from several sources, including data lakes, cloud object storage, and distributed filesystems.
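The article does not describe a specific loader API, but a hedged sketch of what loading and partitioning data from cloud object storage might look like is shown below, using pandas with s3fs and Ray tasks rather than CodeFlare's own adapters. The bucket and file names are placeholders.

```python
# Hypothetical sketch: read partitioned Parquet files from object storage in
# parallel with Ray tasks; paths and bucket names are illustrative only.
import ray
import pandas as pd

ray.init()

@ray.remote
def load_partition(path):
    # pandas can read s3:// URLs directly when the s3fs package is installed.
    return pd.read_parquet(path)

paths = [f"s3://example-bucket/training-data/part-{i:04d}.parquet" for i in range(4)]
frames = ray.get([load_partition.remote(p) for p in paths])
dataset = pd.concat(frames, ignore_index=True)
```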
CodeFlare is available on GitHub, and IBM is publishing examples that run on Red Hat Operate First and IBM Cloud. Developers who are already using CodeFlare have reportedly cut months off their work.
Hybrid cloud is a significant part of IBM's growth strategy. In fiscal year 2020, IBM's cloud revenue grew roughly 20%, helped by Red Hat's hybrid cloud offerings.