AI Research at LF Deep Learning Foundation Gains 2 New Projects
Joining are a machine-learning platform and an elastic deep-learning framework that will expand the foundation's ecosystem in AI, deep learning and more.
August 31, 2018
Five months after launching as a major resource for deep learning, AI and machine learning, the LF Deep Learning Foundation is growing with the addition of two new open-source projects that will continue to expand its mission.
The new initiatives are The Angel Project, a high-performance distributed machine-learning platform, and EDL, an Elastic Deep Learning framework designed to help deep-learning cloud service providers build cluster cloud services.
The Angel Project, which is based on the parameter-server model and runs on YARN and Apache Spark, was contributed by LF Deep Learning Foundation member Tencent of China. Tuned for performance with big data, Angel supports large and complex models with billions of parameters and implements a variety of machine-learning algorithms using efficient model-updating interfaces and functions, according to the Foundation. The Angel Project has more than 1,000 commits.
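The parameter-server model that Angel builds on splits training between a server that holds the shared model and workers that repeatedly pull the current parameters, compute gradients on their own data shards and push updates back. The toy Python sketch below illustrates that pull/push loop for a one-dimensional least-squares problem; the `ParameterServer` and `worker_step` names are illustrative only, not Angel's actual API.

```python
# Conceptual sketch of the parameter-server pattern Angel is built on.
# Names here (ParameterServer, worker_step) are illustrative, not Angel's API.

class ParameterServer:
    """Holds the shared model; workers pull current values and push gradients."""
    def __init__(self, dim):
        self.weights = [0.0] * dim

    def pull(self):
        return list(self.weights)

    def push(self, gradient, lr=0.1):
        # Apply a worker's gradient to the shared model.
        for i, g in enumerate(gradient):
            self.weights[i] -= lr * g


def worker_step(server, data_shard):
    """One worker iteration: pull weights, compute a local gradient, push it."""
    w = server.pull()
    # Toy least-squares gradient over (x, y) pairs: d/dw of (w.x - y)^2
    grad = [0.0] * len(w)
    for x, y in data_shard:
        err = sum(wi * xi for wi, xi in zip(w, x)) - y
        for i, xi in enumerate(x):
            grad[i] += 2 * err * xi / len(data_shard)
    server.push(grad)


server = ParameterServer(dim=1)
shard = [([1.0], 2.0), ([2.0], 4.0)]   # data consistent with weight = 2.0
for _ in range(50):
    worker_step(server, shard)
print(round(server.weights[0], 2))      # prints 2.0
```

In a real deployment many workers would call `push` concurrently against servers sharded across a cluster; the appeal of the pattern for billion-parameter models is that no single worker ever needs the whole model in memory.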
Xiaolong Zhu, Tencent’s senior AI researcher and technical advisory committee member of the LF Deep Learning Foundation, said Angel shares the Foundation’s goal of making deep learning easier to use.
“By becoming a part of the LF Deep Learning Foundation, we believe Angel will be more active in the open-source community, accumulate more use cases, expand usage scenarios and actively cooperate with other partners,” he said. “As a new project under the Foundation, Angel will continue working on a consistent and continuous user experience to make deep-learning technology easier to apply and develop.”
The EDL Project
EDL uses deep-learning frameworks such as PaddlePaddle and TensorFlow to help deep-learning cloud service providers build cluster cloud services, according to the Foundation. Contributed by the Chinese technology company Baidu, the project has almost 1,000 commits from contributors and includes a Kubernetes controller, a PaddlePaddle auto-scaler and a new fault-tolerant architecture.
PaddlePaddle is an easy-to-use, scalable distributed deep-learning platform, while TensorFlow is an open-source machine-learning framework available to anyone.
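The "elastic" idea behind EDL is that a training job does not hold a fixed number of workers: it grows when the cluster has spare capacity and shrinks under pressure, within bounds the job declares. A minimal sketch of that scaling decision is below; the `desired_trainers` function is a hypothetical illustration of the concept, not EDL's actual controller logic.

```python
# Conceptual sketch of elastic scheduling in the spirit of EDL: a job declares
# minimum and maximum worker counts, and the scheduler sizes it to fit whatever
# the cluster can currently spare. Illustrative only, not EDL's actual API.

def desired_trainers(min_workers, max_workers, free_slots):
    """Scale a job between its declared bounds based on free cluster slots."""
    # Never exceed spare capacity, never drop below the job's minimum.
    return max(min_workers, min(max_workers, free_slots))


# A job that needs at least 2 trainers but can use up to 8:
print(desired_trainers(2, 8, free_slots=20))  # idle cluster -> grow to 8
print(desired_trainers(2, 8, free_slots=5))   # constrained  -> run 5
print(desired_trainers(2, 8, free_slots=0))   # busy cluster -> hold at 2
```

In EDL's case this kind of decision is made by a Kubernetes controller, and the fault-tolerant architecture is what lets trainers join or leave a running job without restarting it from scratch.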
“We are excited to see that EDL has been accepted to the LF Deep Learning Foundation,” Yanjun Ma, the head of the deep learning technology department at Baidu, said. “As an elastic deep-learning framework for PaddlePaddle, we believe that EDL will substantially benefit the deployment of large-scale deep learning services and the broader deep learning open-source community.”
Ofer Hermoni, the chairman of the Foundation’s Technical Advisory Council, told Channel Futures that the two new projects will help drive the group’s mission to make access to these technologies easier for companies of all sizes, not just the bigger players such as Google, Facebook and Amazon.
“We are trying to democratize it,” said Hermoni, who works for software and services vendor Amdocs as the director of product strategy in the CTO’s office. That also means making the technologies available to the channel and end users, he added.
“Anyone can use it and anyone can join the effort and contribute and learn and influence the projects,” he said. “Today, AI and machine learning are dominated by only a relatively small group of companies. I believe this Foundation will totally change the industry.”
Companies and channel developers can get involved with the projects on GitHub by joining the efforts and contributing, he said.
Several analysts said the new projects offer interesting possibilities in the crowded and emerging fields of AI, machine learning and deep learning.
“What will be important is for these projects to clearly focus on a market segment that is not already covered by other open-source machine learning or deep-learning technologies,” said Al Gillen, an analyst with IDC.
On the other hand, he added, he has mixed confidence in the emerging technologies because, while many are open source, they often are positioned as cloud services rather than as commercial products that customers can deploy on premises.
“However, ironically at some level, the consumption model associated with these open-source solutions tends to include some level of lock-in, whether it is from surrounding technologies or from the training data or the content data itself living in a given cloud,” said Gillen.
At the same time, though, these technologies are coming and businesses must be ready, he said.
“All organizations are going to be consuming machine learning, deep learning and artificial-intelligence technologies, so the more options that are available to enterprises, the more choices they have to work with.”
Another analyst, Karl Freund of Moor Insights & Strategy, said the Angel project in particular will make enhancements available for Apache users who are extending their big-data projects with machine-learning tools.
The EDL tool set, meanwhile, will enable communications service providers (CSPs) to build deep-learning services, although he said he’d be surprised if the larger CSPs need that kind of help.
“Smaller CSPs, however, will benefit from EDL,” added Freund.
Charles King, principal analyst with Pund-IT, told Channel Futures that like a lot of other open-source projects, these efforts might not be quite ready to have a commercial impact.
“I’m not sure I would qualify it as channel-ready at this point,” he said.
The LF Deep Learning Foundation, which is a community umbrella project of The Linux Foundation, was launched in March with a sole project, the Acumos AI Project.