In the field of data science, the volumes of data can be enormous, hence the term Big Data. It is essential that algorithms operating on these data sets run as efficiently as possible. One measure of efficiency is Big-O time complexity, which is expressed not in clock time but in terms of the size of the input data. For example, an algorithm operating on an array of size N may take on the order of N^2 operations to complete. Knowing how to calculate Big-O gives the developer another tool for making software as good as it can be, and provides a means to communicate performance when reviewing code with others. In this course, you will analyze several algorithms to determine their Big-O performance. You will also learn how to visualize that performance using the graphing module pyplot. Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
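For instance, one way to see O(N^2) growth concretely is to count the operations performed by a simple nested-loop algorithm at several input sizes and plot the counts with pyplot. The sketch below is illustrative only and is not taken from the course; the operation-counting function and the chosen input sizes are assumptions made for the example.

# A minimal sketch (not course material): counting the operations of a
# quadratic, O(N^2), algorithm and visualizing its growth with pyplot.
import matplotlib.pyplot as plt

def count_pairwise_ops(data):
    """Count comparisons made by a naive all-pairs scan, an O(N^2) algorithm."""
    ops = 0
    for i in range(len(data)):
        for j in range(len(data)):
            ops += 1  # one comparison per (i, j) pair
    return ops

sizes = [100, 200, 400, 800, 1600]
counts = [count_pairwise_ops(range(n)) for n in sizes]

plt.plot(sizes, counts, marker="o", label="measured operations")
plt.plot(sizes, [n ** 2 for n in sizes], linestyle="--", label="N^2 reference")
plt.xlabel("input size N")
plt.ylabel("operations")
plt.legend()
plt.show()

Plotting the measured counts against an N^2 reference curve makes the quadratic growth visible at a glance, which is the kind of visualization the course description refers to.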
Gain a fundamental understanding of blockchain technology and its components. Learn how applications such as cryptofinance make use of the blockchain for decentralized,...
This course focuses on the recovery of the 3D structure of a scene from its 2D images. In particular, we are interested in the 3D reconstruction of a rigid scene...
"Excel/VBA for Creative Problem Solving, Part 2" builds off of knowledge and skills obtained in "Excel/VBA for Creative Problem Solving, Part 1"...
In this course, offered by UNAM, we will cover the past, present, and future of artificial intelligence. We will also introduce the most important concepts...