Difference makes the DIFFERENCE

**Link: https://padhai.onefourthlabs.in/**

https://www.youtube.com/channel/UC337_KN0yuNvzeZnlstEa_A/featured

- Make yourself familiar with the Python 3 datatypes, built-in data structures, control structures and functions using the following resources:

https://www.youtube.com/watch?v=OMGt-jcMlCs

My view: the first video that anyone interested in Machine Learning or Artificial Intelligence courses should listen to, not just hear, and understand. This video is from **OneFourth Labs.**

- Brush up Python - Jupyter Notebook Link: http://bit.ly/ofl-python
- Brush up Python - Jupyter Notebook in PDF: Brush up Python.pdf
- Python introductory tutorial: https://beginnersbook.com/2018/03/python-tutorial-learn-programming/
- A Free introductory course on python for data science: https://www.datacamp.com/courses/intro-to-python-for-data-science
- Google python class: https://developers.google.com/edu/python
- Detailed python 3 documentation: https://docs.python.org/3/tutorial/

- Linear Algebra: http://www.deeplearningbook.org/contents/linear_algebra.html
- Calculus: http://wiki.fast.ai/index.php/Calculus_for_Deep_Learning
- Matrices and Calculus: https://explained.ai/matrix-calculus/index.html

- Check out the following free courses in Khan Academy:

- Linear Algebra: https://www.khanacademy.org/math/linear-algebra
- Calculus Intro: https://www.khanacademy.org/math/calculus-1
- Multivariate Calculus: https://www.khanacademy.org/math/multivariable-calculus

**Link to install Matplotlib:** Here we have the required commands to install the respective files, along with examples, tutorials and APIs.

In the Visual Studio Code environment, execute the given command from the terminal, or from your working Python folder:

C:/> pip install -U matplotlib

https://matplotlib.org/users/installing.html

Matplotlib gets installed. This needs an internet connection.
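Once installed, a short sanity check confirms Matplotlib works. A minimal sketch (the output file name is just an example; the Agg backend is used so no display window is needed):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, so this runs headless
import matplotlib.pyplot as plt

# Plot a simple line and save it to a file instead of opening a window
plt.plot([1, 2, 3, 4], [1, 4, 9, 16])
plt.xlabel("x")
plt.ylabel("x squared")
plt.savefig("check_matplotlib.png")
print("saved check_matplotlib.png")
```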

https://matplotlib.org/index.html

**Link to install NumPy:** NumPy is the fundamental package for scientific computing with Python. It contains, among other things, a powerful N-dimensional array object, sophisticated ...

At C:/> pip install -U pip (pip is upgraded, if not already)

At C:/> pip install -U numpy

NumPy gets installed. This needs an internet connection.
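A quick check that NumPy installed correctly, exercising the N-dimensional array object mentioned above (a minimal sketch):

```python
import numpy as np

# Create a 2x3 array and try a few basic operations
a = np.array([[1, 2, 3], [4, 5, 6]])
print(a.shape)           # → (2, 3)
print(a.sum())           # → 21
print((a * 2).tolist())  # → [[2, 4, 6], [8, 10, 12]]
```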

Link to Source: https://docs.scipy.org/doc/numpy/user/quickstart.html

https://angel.co/projects/518942-human-activity-recognition-machine-learning

http://robots.ox.ac.uk/~minhhoai/papers/SegReg_CVPR11.pdf (I could not download this link; if you can, please send me a copy.)

In general, if we do not use sns.set(), the background would be plain while displaying the graph.

For a better visual understanding we use sns.set().
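The difference is easy to see side by side. A minimal sketch (file names are illustrative; the Agg backend is used so the plots are saved rather than displayed):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt
import seaborn as sns

# Without sns.set(): plain white background
plt.plot([0, 1, 2, 3], [0, 1, 4, 9])
plt.savefig("plain_background.png")
plt.close()

# With sns.set(): seaborn's default theme (grey grid background)
sns.set()
plt.plot([0, 1, 2, 3], [0, 1, 4, 9])
plt.savefig("seaborn_background.png")
plt.close()
```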

https://seaborn.pydata.org/tutorial/aesthetics.html

An Introduction to Statistical Learning - http://www-bcf.usc.edu/~gareth/ISL/

Google Machine Learning Course - https://developers.google.com/machine-learning/crash-course/

Essentials of Metaheuristics - https://cs.gmu.edu/~sean/book/metaheuristics/Essentials.pdf

- A few useful things to know about Machine Learning - https://homes.cs.washington.edu/~pedrod/papers/cacm12.pdf
- A Byte of Python eBook (to read on desktop): https://python.swaroopch.com/ It is easy to keep this book open in a browser and practice with the Python command prompt or with any available editor. PDF, Mobi and eBook links are also available towards the end of this webpage.

If the above link doesn't work, you can download the PDF version from this link.

- Installing NumPy on Windows: This link provides a clear description of installing the NumPy package in a Windows environment.

http://web.cs.wpi.edu/~cs1004/c17/Resources/Windows/SettingUpPython_Windows.pdf

(If this file is not found at the source, you can get it from this portal: Link to file)

- PDF File: (Adaptive Computation and Machine Learning) Kevin P. Murphy - Machine Learning: A Probabilistic Perspective - The MIT Press (2012) - Link to PDF
- Pattern Recognition And Machine Learning - Springer 2006 - Link to PDF
- http://greenteapress.com/thinkstats2/thinkstats2.pdf
- Feature Engineering Book from O'Reilly. This book is over 200 pages. It may be helpful for reference purposes.

**basic source: PadhAI Course-ware - Chat Group**

Various other sources on Python, AI, DL and ML (Artificial Intelligence, Deep Learning and Machine Learning)

- One of the best places to learn python, AI and the related is: https://www.kaggle.com/learn/overview
“A Simple Neural Network from Scratch with PyTorch and Google Colab” by Elvis https://link.medium.com/i5BpfxQ4KT

https://pytorch.org/tutorials/beginner/nlp/pytorch_tutorial.html

- For newbies in PyTorch:

Tip: type every single command mentioned in the docs to gain familiarity.

https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html

- PyTorch free course: https://in.udacity.com/course/deep-learning-pytorch--ud188
- http://vision.stanford.edu/teaching/cs131_fall1617/schedule.html
- Udemy Deep Learning Prerequisites: https://www.udemy.com/deep-learning-prerequisites-the-numpy-stack-in-python/

The NumPy, SciPy, Pandas, and Matplotlib stack: prep for deep learning and machine learning.

https://airtable.com/invite/r/nfHfFokR

Airtable: Organize anything you can imagine. Airtable works like a spreadsheet but gives you the power of a database to organize anything. Sign up for free.

The Resources page on this website contains a list of hand-selected goodies that are regularly recommended to both beginners and experts: https://pythondiscord.com/info/resources

Try to understand how these tutorials work: https://www.tensorflow.org/tutorials Run them with Google Colab.

Udacity Data Science Mega link, if anyone needs it:

https://mega.nz/#F!qrpxSIRD!PClG5ZMHdd5FroIFTT_Z5Q

Codecademy is another way to learn how to code. It's interactive, fun, and you can do it with your friends. https://www.codecademy.com/catalog/language/python

Best Books And Sites For Machine Learning

How to get started in machine learning? Python is here because, if you are new to machine learning and new to programming, then Python would be a really good choice.

https://www.houseofbots.com/news-detail/4685-1-best-books-and-sites-for-machine-learning

Udemy -> free course

Introduction to PyTorch and Machine Learning | Udemy

Learn the basics of ML & PyTorch - Free Course

https://www.udemy.com/intro-to-pytorch-and-machine-learning/

SoloLearn: Learn to Code - This is a website-based learning portal, where each next module is unlocked only when the current one is completed with a quiz or multiple-choice question.

*So one is forced to learn in sequence.* Join now to learn the basics or advance your existing skills. Link: https://www.sololearn.com/

This website presents a series of lectures on quantitative economic modeling, designed and written by Thomas J. Sargent and John Stachurski.

Link: https://lectures.quantecon.org/py/

Recommended Python learning resources: https://forums.fast.ai/t/recommended-python-learning-resources/26888

Unofficial Windows Binaries for Python Extension Packages.

MIT Machine Learning and Statistics - https://ocw.mit.edu/courses/sloan-school-of-management/15-097-prediction-machine-learning-and-statistics-spring-2012/index.htm

MIT Linear Algebra Course - https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/

Book on Amazon Link: https://amzn.to/2ImYVhC

https://www.coursera.org/specializations/mathematics-machine-learning

Artificial Intelligence - A Modern Approach - https://amzn.to/2Ip0Wdi (book)

Breast cancer classification with Keras and Deep Learning - PyImageSearch

https://codequs.com/p/BkaLEq8r4/a-complete-machine-learning-project-walk-through-in-python

https://www.pyimagesearch.com/2019/02/18/breast-cancer-classification-with-keras-and-deep-learning/


Guys, this might be helpful for future purposes: https://becominghuman.ai/cheat-sheets-for-ai-neural-networks-machine-learning-deep-learning-big-data-678c51b4b463

Anyone who wants to open a Jupyter notebook (.ipynb) file offline on a PC first needs the Jupyter library. You can get it by typing:

*pip install --upgrade pip*

pip install jupyter

Or, if you are using Anaconda/Miniconda, you don't need to install anything.

For the remaining steps, see this video: https://youtu.be/esKiryoBJBI

Cheat Sheet of Machine Learning and Python (and Math) Cheat Sheets

If you like this article, check out another by Robbie: My Curated List of AI and Machine Learning Resources

https://in.udacity.com/course/ai-artificial-intelligence-nanodegree--nd898

Artificial Intelligence Course | Udacity - Learn the essentials of machine learning from experts like Norvig and Sebastian

Demystifying Entropy

https://towardsdatascience.com/demystifying-entropy-f2c3221e2550

The research paper is all about the implementation of our project, including the extraction of text regions:

https://www.researchgate.net/publication/311578437_TextNon-text_Image_Classification_in_the_Wild_with_Convolutional_Neural_Networks

*In many cases, the information given below the video (under the show-more button) is more useful than the video itself, so please review these contents carefully as well.*

- Kaggle: I think this will help you for sure: https://www.youtube.com/watch?v=Gp_qv317Gew
- The following link seems to be good for a newcomer to the Python programming language: https://www.youtube.com/watch?v=RjMbCUpvIgw
- Linear Algebra: https://www.youtube.com/watch?v=ZK3O402wf1c&list=PL49CF3715CB9EF31D
- Lecture 1: Probability and Counting | Statistics 110 (YouTube, Harvard University): https://www.youtube.com/watch?v=KbB0FjPg0mw&index=1&list=PL2SOU6wwxB0uwwH80KTQ6ht66KWxbzTIo
- Neural Networks: https://www.youtube.com/watch?v=qv6UVOQ0F44
- Machine Learning: https://www.youtube.com/watch?v=UzxYlbK2c7E&list=PLA89DCFA6ADACE599
- https://www.youtube.com/watch?v=tGyfmzuR4d4
- Visualizing 4D Geometry - A Journey Into the 4th Dimension [Part 2]
- How Machines Learn
- PyTorch vs TensorFlow: The Force Is Strong With Which One? | Which One You Should Learn? | Edureka
- The incredible inventions of intuitive AI | Maurice Conti
- If anyone wants to go for a Python course along with data structures and algorithms: https://onlinecourses.nptel.ac.in/noc19_cs08/unit?unit=22&lesson=25
- But what *is* a Neural Network? | Deep learning, chapter 1 - YouTube: https://youtu.be/aircAruvnKk?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi

In course of time, some of the links may not be functional; if you find any such links, please do let me know with the correct address, so that the next person may have a valid link.

**I don't take any credit for the following links, as they are not my own (with the exception of a very few), but are from the course that I am opting for, and from other related chat resources.**

- https://www.sciencedirect.com/science/article/abs/pii/S0092656608000469
- https://www.independent.co.uk/life-style/how-to-spot-liar-lying-seven-tips-human-detector-darren-stanton-a7487351.html
- https://en.wikipedia.org/wiki/Lie_detection
- https://www.nap.edu/read/10420/chapter/8#157
Data Visualization Society Launches

A new community for data visualization professionals has launched. It is called the Data Visualization Society.

https://101.datascience.community/2019/03/02/data-visualization-society-launches/

**Analytics India Magazine**

FB’s New Python Library Nevergrad Provides A Collection Of Algorithms That Don’t Require Gradient Computation

Nevergrad, an open-sourced Python 3 toolkit by Facebook for developers, offers an extensive collection of algorithms.

https://www.analyticsindiamag.com/fbs-new-python-library-nevergrad-provides-a-collection-of-algorithms-that-dont-require-gradient-computation/

- TensorFlow Dev Summit 2019 Livestream: https://youtu.be/bDZ2q6OktQI
- https://dms.licdn.com/playback/C5105AQGIvn6zJ3jAGQ/519bdcda628946c4bb634188671d8e66/feedshare-mp4_500-captions-thumbnails/1507940118923-hysdc8?e=1551603600&v=beta&t=b7pjgrGrSFZGKg2M-Mmspds-zdneMOj5Y5nqCNV0Cq8

My publication on the Medium website (as per the course)

YouTube Publication:

**Last Update Date: 28 Mar 2019**

**Question: **

Consider a message A which can take on 8 values. The probabilities of each of these 8 values are 1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, 1/128. You made an error in estimating these probabilities and came up with the following estimates for the 8 values: 1/128, 1/128, 1/64, 1/32, 1/16, 1/8, 1/4, 1/2 (in other words, you estimated the highest probability for the least probable message, and so on). What is the KL divergence between the true distribution and your estimated distribution (choose the closest rounded-off number)?

**Answer: **

True entropy:

H1 = -((1/2)*math.log2(1/2) + (1/4)*math.log2(1/4) + (1/8)*math.log2(1/8) + (1/16)*math.log2(1/16) + (1/32)*math.log2(1/32) + (1/64)*math.log2(1/64) + (1/128)*math.log2(1/128) + (1/128)*math.log2(1/128)) = 1.9844

Cross entropy:

H2 = -((1/2)*math.log2(1/128) + (1/4)*math.log2(1/128) + (1/8)*math.log2(1/64) + (1/16)*math.log2(1/32) + (1/32)*math.log2(1/16) + (1/64)*math.log2(1/8) + (1/128)*math.log2(1/4) + (1/128)*math.log2(1/2)) = 6.5078

KL divergence:

H2 - H1 = 4.5234 (approx. 4.5)
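The arithmetic above can be checked with a few lines of Python, using only the standard math module:

```python
import math

# True distribution p and (badly) estimated distribution q over 8 messages
p = [1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, 1/128]
q = list(reversed(p))  # highest probability assigned to the least probable message

# True entropy: H1 = -sum p * log2(p)
H1 = -sum(pi * math.log2(pi) for pi in p)

# Cross entropy: H2 = -sum p * log2(q)
H2 = -sum(pi * math.log2(qi) for pi, qi in zip(p, q))

# KL divergence D(p || q) = H2 - H1
kl = H2 - H1
print(round(kl, 4))  # → 4.5234
```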

**Question: **

Consider the message A which can take on one of 8 values. The probability of each of these values is 1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, 1/128. What is the ideal number of bits that you should use to transmit this message (choose the closest rounded off number)?

**Answer:**

Consider the distribution table below:

| P(X)  | IC (# bits) |
|-------|-------------|
| 1/2   | 1 |
| 1/4   | 2 |
| 1/8   | 3 |
| 1/16  | 4 |
| 1/32  | 5 |
| 1/64  | 6 |
| 1/128 | 7 |
| 1/128 | 7 |

So, ideal number of bits = expected number of bits =

1/2*1 + 1/4*2 + 1/8*3 + 1/16*4 + 1/32*5 + 1/64*6 + 1/128*7 + 1/128*7 = 1.98 (rounding off) --> 2
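The same expectation can be computed directly, since the information content of each value is IC = -log2(p) and the expected number of bits is the entropy:

```python
import math

p = [1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, 1/128]

# Information content of each value, in bits: IC = -log2(p)
ics = [-math.log2(pi) for pi in p]

# Expected number of bits = sum over values of p * IC (the entropy)
expected_bits = sum(pi * ic for pi, ic in zip(p, ics))
print(expected_bits)  # → 1.984375, which rounds off to 2
```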

**Question: **

Ramesh is supposed to report to the office at 9 am every morning. In the past 30 days, he was either late by 10 minutes or 15 minutes, or he came 5 minutes before time or exactly on time. More specifically, he was on time on 4 days, late by 10 minutes on 10 days, late by 15 minutes on 5 days, and 5 minutes before time on 11 days. What is the expected time at which he will reach the office today?

**Answer:**

Sir, while explaining, actually missed dividing by the number of occurrences (in this example, 30) to find the expectation.

9:00 AM x 4 days = 0 mins delay

9:10 AM x 10 days = 100 mins delay

9:15 AM x 5 days = 75 mins delay

8:55 AM x 11 days = -55 mins delay

Total = 0 mins + 100 mins + 75 mins late - 55 mins early = 120 mins

Hence 120 mins / 30 days = 4 mins/day

So 9:00 AM + 4 mins gives an expected arrival of 9:04 AM today.
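The working above amounts to a weighted average of the delays; a short sketch:

```python
# Delay in minutes relative to 9:00 AM, mapped to the number of days observed
delays = {0: 4, 10: 10, 15: 5, -5: 11}

total_days = sum(delays.values())                     # 30
total_delay = sum(d * n for d, n in delays.items())   # 0 + 100 + 75 - 55 = 120

expected_delay = total_delay / total_days             # 4.0 minutes
print(f"Expected arrival: 9:{int(expected_delay):02d} AM")  # → 9:04 AM
```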

**Question:**

In the past 1024 days, the Cruciatus curse was used once at Hogwarts. Now consider a random variable X which will take on the value 1 if the Cruciatus curse is used and the value 0 if it is not used. What is the Information Content of the event X = 1?

**Answer:**

-log2(P(x)) is the formula.

P(x) = 1/1024

So, substituting P(x) into the formula:

-log2(1/1024) = -log2(1024^-1) = -(-1) * log2(1024) = log2(2^10)

We know log2(2) = 1, and log(m^n) = n log m,

so the answer is 10.
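The same result in code:

```python
import math

# Information content of an event with probability p, in bits: IC = -log2(p)
p = 1 / 1024
ic = -math.log2(p)
print(ic)  # → 10.0
```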

**Question:**

Consider a discrete random variable which can take one of n possible values. State whether True or False: The entropy of this random variable is maximum when all the n values are equally likely.

**Answer: 1:**

If we put more probability mass into one event of a random variable, we have to take some away from other events. The high-probability event has less information content but more weight; the others have more information content but less weight. Since entropy is the expected information content, it goes down: the event with the lower information content is weighted more heavily.

As an extreme case imagine one event getting a probability of almost one, therefore the other events will have a combined probability of almost zero and the entropy will be very low.

**Answer: 2:**

Think about it this way: if all events are equally probable, then you don't know which event is most likely to happen, hence entropy is maximum. If the probability of one event is higher, then you can be more sure that this event is likely to happen. **Entropy can be understood as the degree of confusion you have about what will happen.**
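Both answers can be verified numerically: the uniform distribution over n values attains entropy log2(n), and any skewed distribution falls below it. A small sketch:

```python
import math

def entropy(p):
    """Shannon entropy in bits; skips zero-probability terms."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

n = 8
uniform = [1/n] * n
skewed = [0.99] + [0.01/(n-1)] * (n-1)  # almost all mass on one event

print(entropy(uniform))  # → 3.0 (= log2(8), the maximum for 8 values)
print(entropy(skewed))   # much lower, close to 0
```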

If you find any useful links that you can share, please post them here; I will include them in the above list.