Sharing LV’s recent talk at CogX. Of the 300-400 speakers she was, I believe, the only woman (*correction – a handful of women*) speaking on a technical subject, so a huge stride forward for #WomenInStem and #IndiansInAcademia (*academia in Britain, especially in the higher and more complex echelons, is astonishingly white*). It might be shirk to say so but I suspect Vidhi might be Lakshmi in human form…

Judging from what our beloved commentariat constantly snarks about me in the threads, it’s astonishing she married a lightweight like me 😉

A talk on #gaussianprocesses by the very talented @VLalchand – https://t.co/HIUfnTKJ2p at #CogX2018 @turinginst #CogX18 #MachineLearning #AI #Mathematics #science #womeninleadership #WomenInSTEM #indian

— Zachary Zavidé (@ZacharyLatif) June 14, 2018

LV's Talk at CogX on Gaussian Processes

Facebook Live of Vidhi Lalchand's Talk at CogX for the ATI (Alan Turing Institute) on her research area, Gaussian Processes.

Posted by Zachary Zavidé on Monday, June 11, 2018

Nerd

Lakshmi is all things to all people..

Aren’t you cute!😀

Never thought stuff from BP would be something I would forward to my colleagues. So cool!

I wonder how it compares to cross-entropy methods in flexibility.

I’ve been told that 😉

I sent your comment to V, she’ll reply iA.

Not that you guys should discuss BP comments at home, but a couple more things for future reference.

1. Inversion of large matrices is difficult, but there have been many developments in finite element methods for large-matrix inversion and storage. With domain decomposition methods, this isn’t a major disadvantage for GPs.

2. The GP advantage of smooth, twice-differentiable functions is also its disadvantage for step functions. Also, covariance can only model first-order relationships between any two variables.
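To illustrate point 2, here is a tiny sketch of what that disadvantage looks like in practice: GP regression with a smooth (RBF) kernel on step-function data gives a posterior mean that smooths over the jump rather than reproducing it sharply. All hyperparameters and names here are illustrative, not from the talk.

```python
import numpy as np

def rbf(a, b, ell=0.3):
    """Squared-exponential covariance (illustrative length-scale)."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

x_train = np.linspace(-2, 2, 40)
y_train = (x_train > 0).astype(float)            # step function data

noise = 0.01
K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
x_test = np.array([-1.0, -0.05, 0.05, 1.0])
K_star = rbf(x_test, x_train)
mean = K_star @ np.linalg.solve(K, y_train)      # GP posterior mean

# Far from the jump the fit is fine (near 0 and near 1); right at the jump
# the smooth prior blurs the transition instead of reproducing the step.
print(np.round(mean, 2))
```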

But the presentation appears to be for a non-specialist audience in surrogate modelling, and these may not have been discussed. Sorry, had to geek out somewhere. 😀

Interesting – I don’t understand anything 🙂

I sent her this comment as well.

Hi Violet,

On 1) there is a lot of work going into making GPs sparse by looking at approximations or subsets of data. I think that is more theoretically interesting than a brute-force computational approach to just making the inverse part faster. (I thought the fastest algorithms for matrix inversion were O(n^2.3728639), which is better than O(n^3) but still not much…)
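One standard version of the "subsets of data" idea is the Nyström low-rank approximation: instead of working with the full n × n kernel matrix, pick m ≪ n inducing points and use K ≈ K_nm K_mm⁻¹ K_mn. A minimal sketch, with an RBF kernel and parameters chosen for illustration (not from the talk):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential covariance (illustrative length-scale)."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 500))              # n = 500 training inputs
z = np.linspace(0, 10, 25)                        # m = 25 inducing inputs

K_nm = rbf(x, z)                                  # (500, 25) cross-covariance
K_mm = rbf(z, z) + 1e-6 * np.eye(len(z))          # (25, 25), jittered for stability
K_approx = K_nm @ np.linalg.solve(K_mm, K_nm.T)   # rank-25 Nystrom approximation

K_full = rbf(x, x)
err = np.max(np.abs(K_full - K_approx))
print(f"max approximation error: {err:.1e}")
```

Downstream, you only ever invert the small m × m matrix, which is where the computational savings over the naive O(n^3) come from.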

2) You can create custom covariance functions to capture any shape or smoothness requirements, as long as they are valid covariance functions (positive semi-definite). Also, you can have non-stationary covariance structures …
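A handy fact behind "custom covariance functions": sums and products of valid (PSD) kernels are themselves valid kernels, so you can compose, e.g., a smooth trend kernel with a periodic one. A small numerical sketch (the specific kernels and parameters are illustrative):

```python
import numpy as np

def rbf(a, b, ell=2.0):
    """Smooth squared-exponential kernel."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def periodic(a, b, period=1.0, ell=0.5):
    """Standard periodic (exp-sine-squared) kernel."""
    d = np.abs(a[:, None] - b[None, :])
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / ell ** 2)

x = np.linspace(0, 4, 40)
K = rbf(x, x) + periodic(x, x)   # sum of valid kernels is a valid kernel

# Check PSD numerically: all eigenvalues nonnegative up to round-off.
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() > -1e-9)
```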

Vidhi welcome to Brown Pundits! Can you contribute articles?

Most people try to solve problems by optimizing for specific equilibria, which causes suboptimal whole-of-system solutions because it doesn’t account for covariance. But geniuses like you can understand whole complex systems (or systems of systems, for geeks) and solve for general equilibria (optimize for many variables simultaneously).

Or just write articles on how AI is changing the world.

Note I am supremely stupid as our dear beloved friend and mentor Kabir correctly said.

On estimations of subsets to make Gaussian processes sparse… well, yeah; can you at a future point share some ways this is done?

Do people still use O(n^3)?

You lost me (again being truly at the very bottom end of stupidity) when speaking about custom covariance functions and non-stationary covariance structures. I am badly in need of an IQ boost pill. [Go brain therapy, gene therapy and inserted bio-engineered tissue!]

Must be nice to live with someone as smart as Zachary. Especially since he tries so hard to hide his intelligence by pretending not to be intelligent so as not to make the rest of us jealous! I love and envy Zach’s deep intellectual humility. 😉

@VRL,

Thanks so much for dropping by!

1. Intuitively, one would think that correlation won’t be strong among a large number of x, and if there is a strong correlation you might end up with a new representative variable (which could itself be a function of all the sub-variables). So, a sparse covariance matrix makes sense. Also, FE methods use all possible methods for converting the stiffness matrix to a sparse matrix (by reordering, etc.) before inversion. Brute force is when all else fails, but it lets us deliver some answers in the meantime. 😀

2. Yeah, non-stationarity could be interesting, but then I guess it runs into the same problems as non-stationary signal representation methods (specifically, I am thinking of an earthquake signal in the frequency domain with a Fourier transform vs. a non-stationary power spectral density). There are a lot more knobs to turn for non-stationarity parameters (e.g., which extra variable should be an indicator for a change in the covariance structure).
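One concrete example of such an extra knob is the Gibbs kernel, where the length-scale l(x) itself varies with the input, so the covariance structure changes across the domain (short length-scales in a wiggly region, long ones in a smooth region, a bit like the strong-motion vs. quiet phases of an earthquake record). The length-scale schedule below is a made-up illustration:

```python
import numpy as np

def lengthscale(x):
    """Hypothetical input-dependent length-scale: wiggly for x < 0, smooth after."""
    return np.where(x < 0, 0.2, 2.0)

def gibbs_kernel(a, b):
    """Gibbs non-stationary kernel: valid (PSD) for any positive l(x)."""
    la, lb = lengthscale(a)[:, None], lengthscale(b)[None, :]
    s = la ** 2 + lb ** 2
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.sqrt(2 * la * lb / s) * np.exp(-d2 / s)

x = np.linspace(-3, 3, 60)
K = gibbs_kernel(x, x)
# Neighbouring points correlate weakly on the short-length-scale side and
# strongly on the long-length-scale side of the domain.
print(round(K[0, 1], 3), round(K[-2, -1], 3))
```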

Anyway, fun stuff and good luck!!

Look forward to magic ML to help people like us skip the difficulties of non-linear regression and do what we need to do. 😀

Definitely look forward to more of your pubs.

Zack, you’re riding on coattails and claiming fame by having a gif of you with the Punditess.

I got to the point of understanding that a Gaussian process is a distribution over functions. Sampling the Gaussian process gives a function.
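That intuition is exactly right, and it can be made concrete in a few lines: evaluate a covariance kernel on a finite grid of inputs, draw from the resulting multivariate Gaussian, and each draw is one sampled "function" on that grid. A minimal sketch with an illustrative RBF kernel (not taken from the talk):

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0):
    """Squared-exponential covariance k(x, x') = exp(-(x - x')^2 / (2 l^2))."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)                       # finite grid of inputs
K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))    # small jitter for numerical PSD
samples = rng.multivariate_normal(np.zeros(len(x)), K, size=3)

# Each row of `samples` is one draw from the GP prior: one sampled function
# evaluated on the grid.
print(samples.shape)  # (3, 50)
```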

The rest was starting to get beyond me.

Long Island, NY upper-class, Sex and the City accent, e.g. saample vs. sample. Did the Punditess grow up in NY/Long Island?

yes haha I do ride on her coattails; I’m the shameless Paki husband!

Punditess is Chennai born and bred but spent her last decade in the UK!

Chennai, Lalchand?

Had a classmate Lalchand, I think they were Sindhis

She is Sindhi actually..

Jeay Sindh; the irony is that I’ve always found Karachi to be my favourite city (a spiritual home of sorts) and I somehow ended up marrying a Sindhi.

Her father’s family is from Tando Adam (https://en.wikipedia.org/wiki/Tando_Adam_Khan) and I can’t remember where her mother’s family is from, but they are Sindhi Khuranas. Of course most Sindhi Bhaibands are Punjabi Hindus who migrated south (the Kirpilanis, who I think might be Brahmins, descend from a Kirpal Singh who moved to Sindh some 200–300 years ago).

Her great grandfather’s family used to host Nehru at their Haveli when he would visit Sindh in British India so they are a historically Congress family.

Vidhi’s paternal grandmother told her that they were originally Afghan Muslims who moved to Sindh and became Hindus; it’s interesting how the Indus Valley is such a strong border zone between Aryavarta and Iran-Zamin.

Punditess sounds British! Another word is Pundita.

true that..

I believe Saraswati is the manifestation of Shakti you are looking for. Although, very often Saraswati and Lakshmi go together.

Nice presentation, very clear exposition.

Wisdom and beauty given form. Syncretic Saraswati and Lakshmi Punditess.

You get to live with such a tejaswi (spiritually effulgent) Pundita every day? Not fair. You have way too much “privilege” 🙁

Don’t get me wrong you are a smart guy. Maybe 98 percentile on composite physical health (Sharira Siddhi), mental health (Chitta Shuddhi), intelligence (Buddhi) index of all humans. {Sorry to be the bearer of bad news, but your handsomeness brings you down a few pegs in the composite index.} Vidhi is seriously 1 basis point {top ten thousandth} material.

Sometime you need to share how you ended up with such a self-assured, transcendentally wise, heavenly angel. Certainly unearned privilege based on this lifetime. You must have done many truly awesome things in past lifetimes.

Vidhi deserves her own theme song. Substitute Vidhi’s name and pictures with Panchali’s in this audio video montage:

https://www.youtube.com/watch?v=Q9mVKNMcGKA

My 1 basis point wife; it sounds catchy!!! Don’t know what I did in my past lives to deserve her but the upshot is after the pheras, she’s now stuck with me for the next 7 lives 🙂

thanks for your kind words otherwise!

Saw the YouTube video. Very instructive. I realize that lots of complex math is behind the general results and intuitions presented in the video, but she presented them very accessibly. Your wife is a very good instructor and no doubt very gifted in the intelligence department.

thank you!

Violet wrote:

“Look forward to magic ML to help people like us skip the difficulties of non-linear regression and do what we need to do. 😀

Definitely look forward to more of your pubs.”

Couldn’t agree more!