Question for all the AWS folks: AWS has high-memory instances (up to 24 TB, I think) that are used for SAP HANA. Do you see customers use these instance types for any big-data ML training use cases? E.g., if all my data fits in RAM, I don't have to use a distributed computing framework like Spark. Ignore the cost issue.
You can think of logging solutions (ELK, Splunk) using it. Maybe some company has a constraint to persist all logs for at least 180 days.
A lot of ML algorithms are piloted on smaller datasets; when they are ported to Spark for big data, they lose accuracy.
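A minimal sketch of what the original question implies (sizes are illustrative stand-ins, not a 24 TB workload): when the whole dataset fits in RAM on one box, you can fit a model directly with in-memory math and skip Spark entirely, e.g. an exact linear-regression solve:

```python
import numpy as np

# Toy in-memory dataset; a high-memory instance could hold vastly more.
rng = np.random.default_rng(0)
n, d = 100_000, 5
X = rng.normal(size=(n, d))
true_w = np.arange(1.0, d + 1.0)          # known coefficients for the demo
y = X @ true_w + 0.01 * rng.normal(size=n)

# Closed-form fit via the normal equations, w = (X^T X)^{-1} X^T y.
# Feasible only because X sits entirely in memory -- no partitioning,
# no shuffle, no approximate distributed solver.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(np.round(w, 2))
```

The single-node exact solve is also one reason accuracy can drop on a port to Spark: distributed implementations often swap closed-form or full-batch methods for iterative, approximate ones.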
Baller attitude. I like it.