

But while I did not know the concept of loops, I did deduce functions from how main() was used.
The entire thing was one big recursion.
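Roughly what that looked like, as a minimal Python sketch (the original was presumably in a C-style language, and the guessing game here is just an invented stand-in):

```python
# Minimal sketch: "looping" by having main() call itself instead of using a loop.
def main():
    guess = input("Guess the number (or q to quit): ")
    if guess == "q":
        return
    if guess == "42":
        print("Correct!")
        return
    print("Nope, try again.")
    main()  # no loop needed: just recurse back into main()

main()
```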
Bro discovered functional programming.
If it has to be live (i.e. work in real time), doing the translation on a server elsewhere is probably a bad idea due to the latency.
It makes more sense to run an optimised translation model locally - ideally in the AirPods directly but if not, on the connected iPhone.
Someone with the expertise should correct me if I am wrong; it’s been 4-5 years since I learnt about NPUs during my internship so I am very rusty:
You don’t even need a GPU if all you want to do is run - i.e. perform inference with - a neural network (NN for short). Just a CPU would do if the NN is sufficiently lightweight. The GPU is only needed to speed up the training of NNs.
The thing is, the CPU is a general-purpose processor, so it won’t be able to run the NN optimally / as efficiently as possible. Imagine you want to do something that requires the NN and, as a result, you can’t do anything else on your phone / laptop (it won’t be a problem for desktops with GPUs though).
Where NPUs really shine is when there are performance constraints on the model: when it has to be fast (to be specific: run at real-time speed), lightweight and memory efficient. Use cases include mobile computing and IoT.
In fact, there’s news about live translation on Apple AirPods. I think this may be the perfect scenario for using NPUs - ideally housed within the earphones directly but if not, within a phone.
Disclaimer: I am only familiar with NPUs in the context of “old-school” convolutional neural networks (boy, tech moves so quickly). I am not familiar with NPUs for transformers - and LLMs by extension - but I won’t be surprised if NPUs have been adapted to work with them.
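To make the “a CPU is enough for inference” bit concrete, here’s a toy sketch. The layer sizes and random “weights” are made up purely for illustration; real models are far bigger, but inference boils down to the same kind of matrix math that NPUs accelerate:

```python
# Toy sketch: inference for a tiny fully-connected NN is just a few matrix
# multiplies, which any CPU can handle. Shapes and weights are invented.
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" weights for a 2-layer net: 16 inputs -> 8 hidden -> 3 outputs.
W1, b1 = rng.standard_normal((16, 8)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((8, 3)), rng.standard_normal(3)

def forward(x):
    h = np.maximum(x @ W1 + b1, 0.0)  # ReLU hidden layer
    return h @ W2 + b2                # output logits

x = rng.standard_normal(16)           # one input sample
print(forward(x))
```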
I’m pretty sure most video games require AI already. I struggle to name ones that don’t use AI. Some that I can think of are snake, two-player pong and two-player chess.
Neural nets, on the other hand - I find it hard to imagine running an NN locally without impacting the game’s performance.
I don’t like it actually but I have over 5k hours in Dota 2.
Average dota 2 experience.
Did you generate this comment with an LLM for irony?
Those that I find the most useful are those that I (and likely many others) tend to take for granted.
For example, fuzzy logic may very well be used in electronics that involve temperature control - fridge, aircon, rice cooker, water heater - under the hood.
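A toy sketch of what that kind of fuzzy controller might look like - the membership ranges and output powers are invented, but the shape of the logic (fuzzy membership, rules, then defuzzification) is the standard one:

```python
# Toy fuzzy temperature controller, roughly the kind of logic a rice cooker
# or aircon might use. All numbers here are made up for illustration.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def heater_power(temp_c):
    cold = tri(temp_c, -10, 5, 20)
    warm = tri(temp_c, 15, 22, 30)
    hot  = tri(temp_c, 25, 35, 50)
    total = cold + warm + hot
    if total == 0:
        return 0.0
    # Rules: cold -> 100% power, warm -> 40% power, hot -> 0% power.
    # Defuzzify with a weighted average of the rule outputs.
    return (cold * 100 + warm * 40 + hot * 0) / total

for t in (2, 18, 24, 33):
    print(t, "°C ->", round(heater_power(t), 1), "% power")
```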
Another one is CSP (constraint-satisfaction problem) solvers, which tend to be used in scheduling software. A possible use case is public transportation.
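And a minimal CSP sketch in the same spirit - plain backtracking over made-up drivers and shifts; real schedulers add constraint propagation and heuristics on top of this:

```python
# Toy constraint-satisfaction scheduler: assign drivers to shifts so that no
# driver works two overlapping shifts. Shifts, drivers and overlaps are invented.

shifts = ["early", "morning", "evening", "night"]
drivers = ["Ana", "Ben", "Cho"]
# Pairs of shifts that overlap and therefore need different drivers.
overlaps = {("early", "morning"), ("morning", "evening"), ("evening", "night")}

def consistent(assignment, shift, driver):
    for other, other_driver in assignment.items():
        if driver == other_driver and (
            (shift, other) in overlaps or (other, shift) in overlaps
        ):
            return False
    return True

def solve(assignment=None):
    """Plain backtracking search over shift -> driver assignments."""
    assignment = assignment or {}
    if len(assignment) == len(shifts):
        return assignment
    shift = next(s for s in shifts if s not in assignment)
    for driver in drivers:
        if consistent(assignment, shift, driver):
            result = solve({**assignment, shift: driver})
            if result:
                return result
    return None

print(solve())
```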
There are probably lots more AIs working behind the scenes that benefit everyone, but don’t get the coverage because they are just boring tech now. People may not even consider them AI!
I appreciate these AIs for making my life so convenient.
Necrowomancers imply the existence of Necrowowomancers.
??
Just thought the phrasing is funny because it is like ATM machine.
ETF index fund
ETF is exchange-traded fund, so “ETF index fund” becomes “exchange-traded fund index fund”
Personally, I found Khan Academy helpful.
For context I studied computer science with a focus on artificial intelligence, machine learning and data science.
Given your background, you may also be interested in Georgia Tech OMSCS’s Machine Learning for Trading.
Unless you have plenty of time and knowledge - in which case you might as well be a day trader or join a hedge fund or HFT firm - you are likely better off taking the easy way out by buying index ETFs.
This post is more fitting for !anime@ani.social IMO.
I’m in the “having a few true friends is better than knowing many acquaintances” camp.
Having two close friends is something to be proud of.
what is the difference between current AI and the human brain?
My understanding is that the fields of neuroscience and psychology are not yet developed enough for anyone to provide a definitive answer to this question.
Anyone who claims otherwise would probably have to make assumptions, and may be talking out of their ass.