It’s not real.
Anything and everything Amateur Radio and beyond. Heavily into Open Source and SDR, working on a multi band monitor and transmitter.
#geek #nerd #hamradio VK6FLAB #podcaster #australia #ITProfessional #voiceover #opentowork
At one point, before we virtualised everything, I had a custom desk built in an L-shape. Instead of a desk and a return, I had the refurbishment team join two full desks. It gave me two sets of drawers, two computer cubby holes, and the gap was too small for the horrible adjustable keyboard shelf that kept hitting your knees, so they replaced it with a fixed surface instead.
People laughed.
Colleagues sniggered.
Then they wanted one too.
Now I have a mobile lectern with an iMac clamped to it. Height adjustable, wheels, enough space for keyboard, trackpad and USB hub. I move around my office as the mood or light takes me.
So, when you use 40 or so programming languages, your employer needs to supply a mansion…
I’m okay with that.
Now, where is the boss?
Further discussion here:
Whilst I agree with your opinion, it continues to astonish me that the majority of non-technical people using a search engine have absolutely no idea just how bad the search landscape has become.
I suppose my question did probably exclude that part of the population, but old habits die hard.
I still use + and - to refine search terms, until I remember that Google+ broke that. And I forget just how ad-infested the internet is until I accidentally click on a piece of empty space in an article where an ad would be, were it not for the Pi-hole on my network.
So, yeah. Point taken.
I’m guessing that the “Felix Baumgartner descent” is not available as an option…
Wonder if the Boeing board should have been on that flight instead. “We stand behind our product.” “Excellent, now take a deep breath and jump.”
Yes, and some days it even acknowledges that there are humans living outside of New York, or even beyond the United States.
Perhaps you might expand your game “design” team to include people outside those sitting in the same office.
At the rate you’re going, we are enjoying it less every day.
Also, the word you’re looking for is: “headless”, as in, “headless install”
The traditional way is to use a serial console from another device.
Backpropagation happens during the creation of the model, not after it’s deployed.
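To make that distinction concrete, here is a minimal sketch (with made-up numbers, fitting y = 2x with a single weight): backpropagation computes gradients and updates the weight during training only; the deployed model just runs the forward pass with frozen weights.

```python
def forward(w, x):
    return w * x

# --- Training: the gradient of the squared error updates w ---
w = 0.0
lr = 0.1
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # samples of y = 2x

for _ in range(100):
    for x, y in data:
        pred = forward(w, x)
        grad = 2 * (pred - y) * x   # d(loss)/dw for squared error
        w -= lr * grad              # the weight changes here, and only here

# --- Deployment: forward pass only, the weight never changes ---
frozen_w = w
print(forward(frozen_w, 10.0))  # ≈ 20.0, no matter how often it's called
```

An LLM at inference time is in the second phase: the same frozen weights answer every prompt.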
I’m glad you defragged it, rather than fragged it…
Consider the impact of donating to one or more clients as the main project.
It’s an interesting question.
Perhaps I’m not devious enough, but the only impact I can see is insurance companies increasing your fees or denying cover.
You used to be able to run Apple Music on Android. I used it for a while. Not sure if it still exists.
DRM is one potential reason, but not the only one.
Content is licensed under specific conditions, resolution, audio tracks, closed captions, etc. Two organisations might have licensed the same title, but not the same conditions.
You can see this clearly during the Olympics where some channels only have secondary rights, or only certain events, but only free to air, not online, etc.
Added to that are marketing and exclusivity deals, and in the end it’s anyone’s guess what you actually end up with.
You mean, a messaging app offered by Meta isn’t secure? I’m shocked, I say, shocked!
Anyway…
The underlying issue with an LLM is that there is no “learning”. The model itself doesn’t dynamically change whilst it’s being used.
This article sets out a process that gives the ability to alter the model, by “dialling up” (or down) concepts. In other words, it’s changing the balance of the weight of concepts across the whole model.
Altering one concept is hardly “learning”, especially since it’s being done externally by researchers, but it’s a start.
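The mechanism the article describes can be sketched roughly like this: add a scaled "concept direction" vector to a layer's activations, shifting the model's behaviour without retraining. The names (`steer`, `concept_dir`) and the 8-dimensional toy vectors are invented for illustration; a real implementation hooks into a specific layer of an actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
activations = rng.normal(size=8)          # stand-in for one layer's output
concept_dir = np.ones(8) / np.sqrt(8)     # unit vector for some "concept"

def steer(act, direction, strength):
    """Shift activations along the concept direction by `strength`."""
    return act + strength * direction

boosted = steer(activations, concept_dir, 3.0)

# The projection onto the concept direction grows by the chosen strength;
# the model's weights themselves are untouched.
print(boosted @ concept_dir - activations @ concept_dir)  # ≈ 3.0
```

The point stands: this is external dialling by researchers, not the model learning anything on its own.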
A much larger problem is that the energy consumption is several orders of magnitude larger than that of our brain. I’m not convinced that we have enough energy to make a standalone “AI”.
What machine learning actually gave us is the ability to automatically improve a digital model of something like the weather. A forecast that once took hours on a supercomputer to cover a week ahead can now be produced on a laptop in minutes, with longer range and better accuracy. Machine learning made that possible.
An LLM is attempting the same thing with human language. It’s tantalising, but ultimately I think the idea applied to language to create “AI” is doomed.
I’ve been using Linux for near enough a quarter of a century as my main desktop and I haven’t regretted it yet.
Linux today is plenty easy for a non-technical audience to use, runs on fewer resources, has global communities, comes in your language and is free.
Okay. Couple of things.
Except that the humpback whale will reproduce long before that marine biologist loses his virginity.