Dimensionality reduction and matrix factorisation
These techniques are used to reduce the dimensionality of the training set when that is convenient, and to factorise matrices so as to separate out the information they contain. We group them together because they are all mathematical tools that operate on matrices.
Contents
Principal component analysis
Non-negative matrix factorisation
Singular value decomposition
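As a quick orientation, here is a minimal sketch of the three factorisations listed above, assuming scikit-learn is available; the data matrix is a random toy example and the choice of 5 components is arbitrary, not a recommendation.

```python
# A minimal sketch of the three factorisations covered in this chapter,
# run on a small random data matrix purely for illustration.
import numpy as np
from sklearn.decomposition import PCA, NMF, TruncatedSVD

rng = np.random.default_rng(0)
X = rng.random((100, 20))   # 100 samples, 20 features (non-negative, so NMF applies too)

# PCA: project onto the directions of maximal variance
X_pca = PCA(n_components=5).fit_transform(X)            # shape (100, 5)

# NMF: factorise X ≈ W H with W, H >= 0
nmf = NMF(n_components=5, init="nndsvda", max_iter=500)
W = nmf.fit_transform(X)                                # shape (100, 5)
H = nmf.components_                                     # shape (5, 20)

# Truncated SVD: keep only the top singular vectors of X
X_svd = TruncatedSVD(n_components=5).fit_transform(X)   # shape (100, 5)

print(X_pca.shape, W.shape, H.shape, X_svd.shape)
```

Each of the subsections that follow discusses one of these methods in detail.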