Information
Article
Machine Learning in Python: Main Developments
and Technology Trends in Data Science, Machine
Learning, and Artificial Intelligence
Sebastian Raschka 1,*,†, Joshua Patterson 2 and Corey Nolet 2,3
1 Department of Statistics, University of Wisconsin-Madison, Madison, WI 53575, USA
2 NVIDIA, Santa Clara, CA 95051, USA;
[email protected] (J.P.);
[email protected] (C.N.)
3 Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD 21250, USA
* Correspondence: [email protected]
† Current address: 1300 University Ave, Medical Sciences Building, Madison, WI 53706, USA.
Received: 6 February 2020; Accepted: 31 March 2020; Published: 4 April 2020
Abstract: Smarter applications are making better use of the insights gleaned from data, having an impact on every industry and research discipline. At the core of this revolution lie the tools and methods that are driving it, from processing the massive piles of data generated each day to learning from them and taking useful action. Deep neural networks, along with advancements in classical machine learning and scalable general-purpose graphics processing unit (GPU) computing, have become critical components of artificial intelligence, enabling many of these astounding breakthroughs and lowering the barrier to adoption. Python continues to be the most preferred language for scientific
See “Machine Learning with Python,” a special issue of Information (ISSN 2078-2489)
https://www.mdpi.com/journal/information/special_issues/ML_Python