I mean, how do you think websites work? Of course your mouse and keyboard events are available, otherwise you wouldn’t be able to interact with a website at all.
You need to tell it to run the script.
Proton is not actually sandboxed the way a real container is.
A) If a program running in Proton were given root access in some way, say by tricking people into entering their root password for a claimed update, it would have complete control of your entire system, just like any native program would.
B) Apps running in Proton still have access to the regular file system.
Wine isn’t an emulator or a VM.
Weirdly enough, messaging apps seem to be the only thing I see the general public being OK with diversifying.
Everyone I know uses different messaging apps; I have active conversations in Signal, Telegram, WhatsApp, Messages (SMS), Messages (RCS), Discord, and Matrix.
It’s because computer science degrees aren’t really programming degrees.
A computer science degree sets you up to be a scientist; most common dev jobs are glorified Lego building, patching libraries together and constructing queries. There is skill, knowledge, and effort in those jobs, but they are fundamentally different.
Most common software dev jobs are closer to the end user than not.
Thanks for the breakdown! This is probably the most helpful explanation I’ve seen of a build like this.
Yeah, I do; you brought up that local isn’t always an option.
I desperately want it to work for me; I just can’t get it to work without spending thousands of dollars on hardware only to end up with the same experience as having a regular desktop at my desk.
What is the cost of the thin clients and are you doing this over copper?
Are your desks multi-monitor? To get the bare minimum in my household’s scenario, I would need at least 12 streams at greater than 1080p.
For 5 seats, how much did it cost versus just having a computer in each location? For example, looking at HDBaseT to replace just my desk setup, I would need four ~$350 devices (going by Monoprice for a ballpark: https://www.monoprice.com/product?p_id=21669), and that doesn’t even cover all of the screens in my office.
Right, but who has the resources to rent compute with multiple GPUs? This is a gaming setup, not office work, and the OP was talking about racking it.
All of those services offer an inferior experience to being at the hardware; it’s just not the same. Seriously, try it with multiple 1440p 144Hz displays, it just doesn’t work out well, and you are getting a compromised product for a higher cost. You need a good GPU (or at least a way to decode multiple HEVC streams) in the client, so you can’t just use a standard thin client.
‘Low latency’ means a near-native experience. I’m talking: you sit down at your desk and it feels like you are at your computer (that is, multiple monitors, HDR, USB swapping, Bluetooth, audio, etc., all working seamlessly without noticeably diminished quality). Anything less isn’t worth it, since you can just use your computer like normal.
A DisplayPort-to-fiber extender is $2,000. The fiber is not for the network.
Moonlight does not do what I want; Moonlight requires a GPU on the thin client to decode. You would need a high-end GPU to decode multiple high-resolution video streams. Also, afaik, Moonlight doesn’t support multiple displays.
Can this solution deliver 3+ streams of high-resolution (1440p or higher at 144 fps), low-latency video with no artifacting and near-native performance and responsiveness?
Gaming demands high-fidelity, low-latency I/O; no one wants to spend all this money on racks and thin clients and then get laggy windows and scrolling, artifacts, video compression, and low resolution.
That’s the problem at hand with a gaming server: if you want to replace a gaming desktop with a VM in a rack, you need to actually get the I/O to the user somehow, either through dedicated cables from the rack, fiber, or networking. The first is impractical, since it involves potentially 100 ft runs of multiple DisplayPort, HDMI, USB, etc. cables and is very rigid in its application. The second is very expensive, shooting the price up to thousands of dollars per seat for DisplayPort/USB-over-fiber extenders. And for the third, I have yet to see a VNC/remote solution that can deliver near-native video performance.
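For a rough sense of why the plain-networking option struggles, here’s the ballpark math I’m working from (my assumptions: 2560x1440, 144 Hz, 24-bit color, ignoring blanking and chroma details):

```python
# Ballpark uncompressed bandwidth for the kind of desktop I'm describing.
# Assumptions: 2560x1440 per display, 144 Hz, 24 bits per pixel, no blanking overhead.
width, height, refresh, bpp = 2560, 1440, 144, 24

per_stream_gbps = width * height * refresh * bpp / 1e9   # ~12.7 Gbps per display, raw
three_streams_gbps = 3 * per_stream_gbps                  # ~38 Gbps for three displays

print(f"one 1440p144 stream: ~{per_stream_gbps:.1f} Gbps raw")
print(f"three streams:       ~{three_streams_gbps:.1f} Gbps raw")
```

So either you compress hard (with the artifacts and client-side decode requirements I mentioned) or you pay for the fiber extenders; ordinary gigabit networking isn’t in the running for uncompressed video.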
I should reiterate: the OP wants to do fidelity-sensitive tasks, like video editing; they don’t just need to work on a spreadsheet.
None of the presented solutions cover being in a different place than the rack; the same network is fine, but at a minimum a different room.
How do you deliver high resolution (e.g. 1440p, 144 fps) to multiple monitors with low latency over a network? I haven’t seen anything like that accomplished without running fiber from the host.
Eventually, your thin client will need too much power anyway, making the costs rise a lot. It makes sense in an office where you have 500 seats and you can load balance resources.
If someone can show me a multi-seat gaming server that has native remote performance (as in you drag windows around at 144 fps, not the standard artifacty, high-latency behavior of VNC), I’ll eat a shoe.
5 Mbps is slow enough that it should be considered a free tier, like a basic service-for-being-alive tier.
Yes, smartphones and tablets have replaced desktops for most general users.
This is something people fail to realize, and I think part of it is because Linux people tend to surround themselves with other Linux people.
I have been helping my friend get into Linux. We picked a sensible distro, Fedora, with the default GNOME spin. He loves the UI, great.
But there is a random problem with his microphone: everything is garbled, I can’t recreate it on my hardware, and the cause is unclear.
He reads guides and randomly inputs terminal commands, things get borked, he reinstalls, and the cycle continues.
He tries a different distro; the microphone works, but World of Warcraft is funky with Lutris, so no go.
The result is, all of this shit just works on Windows, and it just doesn’t on Linux. Progress has been made in compatibility, but, for example, there was a whole day of learning just about X vs Wayland and not actually getting to use the computer. For someone who has never opened a terminal before, something as simple to you and me as adding a package repo is complete gibberish.
Yes, you can learn all of this, but to quote this friend who has been trying Linux for the past two weeks: “I’m just gonna reinstall Windows and go back to living my life after work.”
When you have 20 years of understanding Windows, a platform needs to be nearly 1 to 1 with it to get people to switch.
I didn’t know about alien, that is pretty cool.
However, this bit from the readme is hilariously on-brand for Linux:
"To use alien, you will need several other programs. Alien is a perl program, and requires perl version 5.004 or greater. If you use slackware, make sure you get perl 5.004, the perl 5.003 in slackware does not work with alien!
To convert packages to or from rpms, you need the Red Hat Package Manager; get it from Red Hat’s ftp site. If your distribution (eg, Red Hat) provides a rpm-build package, you will need it as well to generate rpms.
If you want to convert packages into debian packages, you will need the dpkg, dpkg-dev, and debhelper (version 3 or above) packages, which are available on http://packages.debian.org"
Also, Linux’s package ecosystems are not cross-compatible.
Except knowledge.
It’s foolish of you to assume that most people want to build a computer.
And before people respond with ‘it’s just Legos’:
There is so much more to it for someone with little to no knowledge.
BIOS and firmware updates that require certain CPUs coupled with certain motherboards.
CPU sockets and their compatibility.
The different specs of any given component and the value they provide to someone with a specific workflow.
Sizing of components and cases.
Knowing where to find parts and what prices are acceptable.
Etc., etc., etc.
Pick something that you know nothing about; let’s say cars, just as an example.
Now imagine you want to buy a car, but it doesn’t come with wheels, and you don’t get a list of four wheels to choose from. You get lug patterns, sizing and type, offset, wheel diameter, wheel width, beadlocks or no beadlocks, 1-piece, 2-piece, or 3-piece, etc.
Now you have to spend all this time researching just wheels, and then how they fit the specific car you chose earlier in the process. It would be frustrating and incredibly difficult for people who just want a car.
Go on any thread or forum and ask ‘what GPU should I get?’, which already assumes a certain level of understanding and knowledge (that they even know what a GPU is), and you will get 20 conflicting answers and need to write paragraphs of responses to narrow it down enough.
Present someone with no knowledge with this: ‘DDR3-2666 CL9’ vs ‘DDR3-2000 CL7’. How do you really expect someone who just wants to play a video game to implicitly know what those numbers mean, how they relate to each other, etc.?
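For what it’s worth, here’s roughly the back-of-the-envelope math you’d need just to compare those two example kits (first-word latency only, which is itself a simplification):

```python
# Rough first-word latency for the two example kits above.
# latency (ns) = CAS cycles / memory clock (MHz) * 1000,
# where the memory clock is half the DDR transfer rate.
def cas_latency_ns(transfer_rate_mts, cas_cycles):
    memory_clock_mhz = transfer_rate_mts / 2
    return cas_cycles / memory_clock_mhz * 1000

print(cas_latency_ns(2666, 9))   # ~6.75 ns
print(cas_latency_ns(2000, 7))   # ~7.00 ns
```

So the two kits land at roughly the same latency and the faster one wins on bandwidth, which is exactly the kind of conclusion nobody new to this could be expected to reach on their own.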
Building a computer is an immensely difficult task for someone who doesn’t know much, or anything, about it, and believe it or not, not everyone wants to learn. Places like Lemmy and other tech-focused echo chambers seem to forget that.
I think mostly people are defending themselves; when Linux people jump on the harassment train, it’s just that: harassment.
I hate writing and reading XML compared to JSON; I don’t really care if one is slightly leaner than the other. If your concern is size or speed, you should probably be rethinking how you serialize the data anyway (protobuf/a DB).
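To put a number on the “slightly leaner” part, here’s a toy comparison with a made-up record (the field names and the hand-written XML equivalent are just for illustration):

```python
import json

# A hypothetical record, only to compare the two encodings by size.
record = {"id": 42, "name": "example", "active": True}

json_bytes = json.dumps(record, separators=(",", ":")).encode()
xml_bytes = b"<record><id>42</id><name>example</name><active>true</active></record>"

print(len(json_bytes), "bytes of JSON")  # ~40 bytes
print(len(xml_bytes), "bytes of XML")    # ~70 bytes
```

Either way, if those few bytes actually matter to you, you’re already in protobuf/binary-format territory, which was my point.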