Wednesday, October 09, 2024

AI: Demis Hassabis: Nobel Prize in Chemistry

How Meta's Movie Gen Does It, AI's Criminal Underground, Court Says LAION is Legal, OpenAI's New Voice API

One day after the physics prize was announced, Demis Hassabis, John Jumper, and David Baker won the Chemistry Nobel Prize for their work on AlphaFold and protein design. AlphaFold and AlphaFold 2, as well as the work of Baker’s lab, are compelling applications of AI that made significant steps forward in chemistry and biology, and this award, too, is well deserved!

 Work on protein structure and design wins the 2024 chemistry Nobel


Google DeepMind Scientists Win Nobel Prize for AlphaFold AI Project - CNET

Demis Hassabis and John Jumper are honored for their work using AI to predict the structure of proteins.



Sir Demis Hassabis (born 27 July 1976) is a British computer scientist and artificial intelligence researcher. Previously he was a video game AI programmer and designer, and an expert board games player.[5][6][7] He is the chief executive officer and co-founder of DeepMind[8] and Isomorphic Labs,[9][10][11] and a UK Government AI Advisor.[12]

Hassabis and John M. Jumper were awarded half of the 2024 Nobel Prize in Chemistry for their protein folding predictions.[14][15]



...His wife is an Italian molecular biologist, researching Alzheimer’s disease...




AI: Geoffrey Hinton, Nobel Prize in Physics!

Geoffrey Hinton - Wikipedia

Geoffrey Everest Hinton CC FRS FRSC[9] (born 6 December 1947) is a British-Canadian computer scientist, cognitive scientist, and cognitive psychologist, known for his work on artificial neural networks, which earned him the title of "Godfather of AI".

In 2024, he was jointly awarded the Nobel Prize in Physics with John Hopfield "for foundational discoveries and inventions that enable machine learning with artificial neural networks." His development of the Boltzmann machine was explicitly mentioned in the citation.


“...I’m particularly proud of the fact that one of my students fired Sam Altman. And I think I better leave it there.”

Hinton was referring to Ilya Sutskever. The former chief scientist at OpenAI joined Helen Toner and two others on the controlling nonprofit board to sack their CEO in a spectacular coup last November. Sutskever quickly regretted his role in plunging OpenAI into crisis, and Altman was returned to his position within days.



Congratulations to Geoff Hinton and John Hopfield for winning the 2024 Physics Nobel Prize! It’s wonderful to see pioneering work in AI recognized, and this will be good for our whole field.




AI embeddings: Cosine similarity versus dot product

classification - Cosine similarity versus dot product as distance metrics - Data Science Stack Exchange

It looks like the cosine similarity of two features is just their dot product scaled by the product of their magnitudes. When does cosine similarity make a better distance metric than the dot product?

Think geometrically. Cosine similarity only cares about angle difference, while dot product cares about angle and magnitude. If you normalize your data to have the same magnitude, the two are indistinguishable. Sometimes it is desirable to ignore the magnitude, hence cosine similarity is nice, but if magnitude plays a role, dot product would be better as a similarity measure. Note that neither of them is a "distance metric".
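That equivalence is easy to check directly: after L2-normalizing both vectors, their dot product equals their cosine similarity. A minimal sketch (function names are illustrative):

```python
import math

def dot(a, b):
    # Sum of element-wise products
    return sum(x * y for x, y in zip(a, b))

def cosine(a, b):
    # Dot product scaled by the product of magnitudes
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))

def normalize(v):
    # Scale the vector to unit length (L2 norm of 1)
    norm = math.sqrt(dot(v, v))
    return [x / norm for x in v]

a, b = [5, 3, 4], [4, 2, 4]
print(cosine(a, b))
print(dot(normalize(a), normalize(b)))  # same value: on unit vectors, dot product IS cosine similarity
```

Both lines print the same number, which is why normalized embeddings are often compared with a plain dot product for speed.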


How to Implement Cosine Similarity in Python | by DataStax | Medium

If the cosine similarity is 1, it means the vectors have the same direction and are perfectly similar. If the cosine similarity is 0, it means the vectors are perpendicular to each other and have no similarity. If the cosine similarity is -1, it means the vectors have opposite directions and are perfectly dissimilar.

A = [5, 3, 4]
B = [4, 2, 4]

# Calculate dot product
dot_product = sum(a*b for a, b in zip(A, B))

# Calculate the magnitude of each vector
magnitude_A = sum(a*a for a in A)**0.5
magnitude_B = sum(b*b for b in B)**0.5

# Compute cosine similarity
cosine_similarity = dot_product / (magnitude_A * magnitude_B)
print(f"Cosine Similarity using standard Python: {cosine_similarity}")
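The three boundary cases described above (1, 0, and -1) can be verified with the same formula; the vectors here are chosen purely for illustration:

```python
import math

def cosine(a, b):
    dot_product = sum(x * y for x, y in zip(a, b))
    magnitude_a = math.sqrt(sum(x * x for x in a))
    magnitude_b = math.sqrt(sum(x * x for x in b))
    return dot_product / (magnitude_a * magnitude_b)

print(cosine([1, 0], [2, 0]))   # 1.0  -> same direction, perfectly similar
print(cosine([1, 0], [0, 3]))   # 0.0  -> perpendicular, no similarity
print(cosine([1, 0], [-2, 0]))  # -1.0 -> opposite directions, perfectly dissimilar
```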


embeddings dot product vs cosine Similarity - Google Search

In data science, the dot product and cosine similarity are both used to measure similarity between vectors, but they differ in how they do so:

Dot product
A fundamental similarity measure that sums the products of corresponding elements in two vectors. It reflects both the direction and magnitude of the vectors, and can range from negative to positive infinity. A value of 0 indicates that the vectors are perpendicular, while negative values indicate opposite directions and positive values indicate alignment. The dot product is especially useful for high-dimensional vectors.

Cosine similarity
This metric is the cosine of the angle between two vectors, or the dot product between their normalizations. It normalizes the value to account only for direction, and results in a range from -1 to 1. Cosine similarity is often used to quantify semantic similarity between high-dimensional objects.

The choice of dot product or cosine similarity depends on the type of similarity being measured. For example, the dot product takes raw word count more into account, while cosine similarity takes proportional word distribution more into account.
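The word-count point can be made concrete with a small sketch: a hypothetical long document with the same word proportions as a short one scores much higher under the dot product, while cosine similarity treats the two identically (the count vectors below are made up for illustration):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine(a, b):
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))

query = [1, 1, 0]            # hypothetical query term counts
short_doc = [2, 1, 0]        # hypothetical word counts
long_doc = [20, 10, 0]       # same proportions, 10x the length

print(dot(query, short_doc), dot(query, long_doc))        # 3 vs 30: dot product rewards raw counts
print(cosine(query, short_doc), cosine(query, long_doc))  # identical: cosine sees only the proportions
```

This is why cosine similarity is the usual default for comparing documents of very different lengths, while the raw dot product is kept when magnitude itself carries signal.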

export function dotProduct(a, b) {
    return a.map((value, index) => value * b[index]).reduce((a, b) => a + b, 0);
}

export function cosineSimilarity(a, b) {
    const product = dotProduct(a, b);
    const aMagnitude = Math.sqrt(a.map(value => value * value).reduce((a, b) => a + b, 0));
    const bMagnitude = Math.sqrt(b.map(value => value * value).reduce((a, b) => a + b, 0));
    return product / (aMagnitude * bMagnitude);
}