
/tech/ - Technology

any of you fucking larpers actually use a minimal system for day-to-day computing? By minimal I mean one that doesn't jump up to 200W of power consumption when I open a text file. I'm talking minimal hardware here.

I keep looking at ARM computers for that but they're mostly retarded chinkware that's borderline impossible to set up with a mainline kernel. Every time I tried to get into shit like the Pi it ended up being a disappointment for various chink-engineering related reasons. My wish would be a minimal system that can do shell, some light graphics, browsing not required and no fucking fans. I don't think it's possible without x86 and the inherent bloat, botnet, fans and high power consumption that has. Why isn't this a thing in two-thousand-fucking-nineteen? Also none of the chinkware I looked at can drive anything higher than 1080p. What the shit is up with that?! I have a 4k monitor and sadly got fucking used to it. Should I go back to 4:3 screens /tech/? I just wanna simplify my life somewhat.


>muh minimalism

>muh 4k

>muh 16:9

>muh not using the sqt



Just model your own processor, it is not that hard anon. Or if it is, there are plenty of other soft microprocessors you can implement in an FPGA. You can even run linux on Xilinx's Microblaze. I'd recommend you take it a step further though, and write your own OS from scratch. If you're white, it's not that hard, but it would help if you were commanded by God to build such a temple.



It seems you are the larper here. You're basically overengineering a fucking computer without realising that even x86 processors have become better on every metric. Use x86 and jump ship if you manage to get everything working on another chip.


File: e6d45c93ff7e7d5⋯.jpg (90.87 KB, 1280x720, 16:9, maxresdefault.jpg)

You need a Pentium, Joey. You can't just go around writing python scripts on ARM systems.


Don't lie to yourself. If you actually wanted to simplify your life then you would be using fucking Windows 10.

The only reason to use Loonix is not being spied on and some narrow, obscure functionality that you probably don't need. And even then and with all the damn bugs, KDE is the only usable desktop environment. I know because I was a tilingfag, and then realized using a bare wm is fucking retarded.



> Windows 10

> simplifying your life

More like getting cucked by your own computer.




It only adds complexity to your life if you think about your privacy from Microsoft. Otherwise the time you'll spend working around Linux, especially on a weak non-x86 system, will be more than any update would take.



Redpill me on the Linux framebuffer. Is it a viable alternative to running a bloated X while still letting you avoid being confined to pure text mode?


Is there anyone here using a oldish (U)SFF Dell Optiplex as a home server or NAS? Preferably without ME/AMT (though even the Core 2 ones seem to have ME, while anything older probably is Netburst based).



Yeah, why?



No it's not. Unless all you do is play mp3s and low-res movies, read pdfs and browse the web with no javascript and massively broken CSS.

Friendly reminder that obsessing over "bloat" is a mental illness.



>He thinks loading a pozload of js makes him productive.



Yeah, I'm sure you're making 800k a year by editing K&R C with vim and then debugging it with gdb on a virtual tty. Faggot larper.



But anon, what does it mean when you say that it's not hard if you're white? Why are whites better at building software?



It really amuses me when webdevs give their insights as to what they believe programming looks like. Hilarious, thanks for sharing.


Look around you Sandeep, everything was invented by white people.


File: a0c4b27042db068⋯.png (5.95 KB, 653x454, 653:454, 10964.png)


Text mode is the final destination. Throw your movies and botnet away.



>Why are whites better at building software?

High IQ, immensely creative and inventive. An autistic Finnish teenager feeling bored can accomplish more than any one of the billions of Asians. Fact.


File: 13cc890f5368d37⋯.png (1.61 KB, 800x480, 5:3, znake.png)


Not him, but I was making good money exactly like that. Had a Thinkpad with 4:3 screen. Ran tmux on the console to have as many local ttys as needed, some in split-screen mode. Shit was cash. But times change and companies get cucked and decide to move their shit to Java/JS. It was just too good to last. Now I'm a weird antisocial NEET that collects text games.



Oh, then tell me what development looks like for you without using Xorg. I'm genuinely curious.


The idea makes sense to me. I don't think it's about bloat in the technical sense; it's more about bloat in the psychological sense. A clean environment free of distractions always makes me more productive. Might be an idea to have some normie device to consume movies and the web, and the framebuffer system for work and other serious computing. A clear distinction between two different systems might help further, psychologically.




You can buy an actual good desktop with a Pentium or better from HP, Dell or Lenovo with that money. You dumb nigger.



>what are command line tools



So basically gcc, vim and gdb?


>not mentioning any specific tool

Larp confirmed.



Just make a wm that fullscreens everything.




I use Devuan ASCII (systemd-free Debian) with Linux-libre, using FVWM as a window manager, and use several suckless programs (not including their coreutils thing, fine with GNU atm). I use Surf as my primary browser, and most things are done in a UNIX-y way (one example: I have no login manager and use a series of X11 conf files with `startx`, if I even use X). I feel my computer is semi-minimalist due to this. I highly recommend you adopt what I have as a beginning to a greater minimalist system.


File: da03ab804a7f96a⋯.png (8.43 KB, 800x500, 8:5, qodem_x11_kermit.png)


It was actually pretty common in the 90's to just do everything from the Unix shell, even if you had a Windows or Mac computer. Basically those were just used to run terminal software (like pic) or some telnet or ssh client, and you'd do everything on the server. And that's pretty much how I did most of my work even up to 10+ years ago. And it wasn't just C stuff, lots of Perl and SQL as well (backend stuff; I didn't like messing with PHP so much, but did it reluctantly sometimes).



I use sway/Wayland on a Pentium D 820 with 4GB of DDR2 RAM. It pushes 1080p and is comfy.


You'll never need over 640K, anon. Bill Gates said so himself (allegedly). Look at TempleOS: it is fully functional, a middle finger to modern computing, and literally the most minimal. Fucks make bloat a requirement now.



Your problem is that you are trying to do everything on your ARM processor. The solution to said problem is setting up a distributed system so that your ARM computer/laptop is just a graphical front-end to your system and all your computational needs are met on another system. This isn't minimal if you use this by yourself, but if you get some friends you can all use the same CPU & file servers from different access points. This won't be sufficient to run "muh gaymes", but it's plenty for everything else.



Plan 9 is good for what this anon is suggesting



Using text interfaces exclusively seems to be the only way to really use just wayland and not x11



That was because unless you were running Linux, you didn't have access to a local Unix environment. Cygwin, Mac OS X and the native ports of Apache and other software to Windows and Mac didn't exist.

So you had to make do with what you had. And even then, you see that some people used ncurses IDEs, file managers and other tools with as many features as it was possible to cram into a text-based UI.

But the kind of people who in this day and age wouldn't run an X server are against those things, because ncurses interfaces cannot be called from scripts, and again, because of their mental illness they're against any code that isn't absolutely necessary to perform basic tasks - without any regard for real-world efficiency, as opposed to the Unix ideal of piping one minimalist program into another to build bigger tools, which doesn't work in the real world.



Literally just use an old Pentium Thinkpad running XP. Despite the P4 supposedly being a space heater, my laptop has a battery life of 4 to 6 hours with old batteries. It's definitely low power. It's not very bloated, not botnet, and not a power hog.

>muh 4k

>muh 16:9

Gonna be a yikes from me. Why are you asking for minimal shit when you want some dumb zoomer technology? Honestly just use a notepad and a pencil and maybe you'll stop being a braindead screen zombie and use your brain for once rather than mindlessly editing linux config files.



>caring about screen ratios because other people on the internet told you to

maybe you are the brainless zombie



There is good reason doe



I'm cruising at around 36W on a 10 year old Core2Duo MacMini. I don't really need to do anything more than 1080p and the hard drive has been switched out to a more modern 500GB. I've given up on the ARM meme. Nvidia is the only company that makes boards with decent GPU drivers and they have no incentive to keep updating old hardware. I'm sure PowerVR is really slick on Android, but at the end of the day even a shitty Intel GMA950 GPU is better for running GNU/Lunix.


File: 5ba621af2f32c93⋯.png (24.01 KB, 793x573, 793:573, weatherspect.png)


Well there's Midnight Commander, you can do just about anything with that. It even lets you transfer files over ftp or scp, or directly edit files on other computer. You can also setup custom menus and keybindings. Like for example, I have a user menu (F2) entry to run diff between panes:

d diff the selected files (or dirs)
%view{ascii} echo diff -urN %s %D/%S

I used to have CVS entries in there and other stuff too, back when I was actually using it at work. But even now I always have an instance of mc open in screen or tmux on my local computer. It saves me a whole lot of time compared to doing everything from the shell exclusively. And I dont need any GUI file manager, this does everything.

For a few years I was using vifm (v0.5), but then it got all bloated and I didn't see the point of using it anymore. Instead I just configured mc to have vi-style keybindings.
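Another menu entry along the same lines, as a sketch - %f (current file) and %D (other panel's directory) are mc macros; the entry letter and wording here are arbitrary:

```
u       Diff current file against the other panel's copy
        %view{ascii} diff -u %f %D/%f
```

Same idea as the pane diff above, just scoped to the file under the cursor instead of the selection.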


I tried to get into various ncurses-based file managers all the time but found it just quicker to use the shell directly. I don't spend that much time shoving files back and forth, to be honest. If I have lots of files to sort or something, the problem is usually better solved in a script than in mc, although I'll always have mc installed since it's just so immensely familiar from the DOS days and it would feel wrong somehow not having it.

I have an AMD Kabini system, integrated graphics and 4 cores that are about as fast as a core2duo core on steroids.

Can drive 4k@60Hz just fine via its DisplayPort. Also has 8 GB of RAM for bigger compilation jobs (running Gentoo). The board (soldered-on SoC included) was cheaper than some of the ARM boards. Downside is it needs tons of firmware blobs loaded by the kernel to work properly. Might or might not be botnet; it's too old to have the PSP included though. Also you need fans: the ITX board came with a passive heatsink on the SoC but will overheat when it's just a little bit warmer than normal. The board burns about 30-40W depending on load. It also comes with nice stuff like USB 3.0 ports, 4 SATA ports and all the luxuries ARM boards usually don't have.

ARM is a meme and will not be good for general computing for a while. It's good with Android and proprietary drivers, but that's all. Some older SoCs are mainlined but they're still terrible to set up and way too expensive for what they deliver. I feel you OP, I was looking in that direction quite often too. It's just not worth it. Maybe in a few years, but as soon as ARM gets good enough to be usable for serious computing and also not dependent on blobs, it will end up having shit like PSP or ME. Quote me on it if you want.

For no-X living:

YAFT: a framebuffer console which gives you 256 freely definable colors to make things a bit prettier. Recently it started to support sixels (https://github.com/saitoha/libsixel), which is basically a format to print graphics directly to the console, including gifs, which is pretty cool and even usable in scripts. It's kinda slow though, as it doesn't use hardware acceleration.

DVTM: a "tiling window manager" for the shell, if you will; be aware it's kinda buggy though, tmux and the like probably work better. It focuses on only being a tiling terminal manager in the UNIX way and doesn't try to be dozens of other things like screen and tmux do. For additional functionality (session attach/detach) there's abduco/dtach.

These are two, haven't felt pulled to a non-X living though as X works fine with the open source AMD drivers.



It is possible to stay productive like that if done in a sane way. For actually writing JS, it doesn't matter what you're using - as long as you can bribe your "team motivator" to give you a chance, and as long as you are able to set up X and whatever other software it depends on to test it.


File: 53d8a6e6e45ddbf⋯.png (14.4 KB, 320x200, 8:5, grandad-and-the-quest-for-….png)


You don't need any special driver blobs to run Firefox or LibreOffice on Linux/ARM. It just works. Even watching videos is fine, there's no need for blobs there. The Allwinner GPU blobs on sunxi are for the actual OpenGL acceleration, not the video decoding. I don't know what max resolution it can push, but I don't care either since I have a 4:3 display and intend to keep it that way. For sure you won't play fancy modern games on these boards, but I don't care about that either. There's too many fun old games for me to give a rat's ass about pozzed modern shit.


It's funny how nobody has realized yet that anon's precious 4k screen will run 1080p just fine with no image degradation, as 4k is literally 4x 1080p.

No arguing, 4k screens are fucking nice if HiDPI (depending on physical size), especially with well-aliased and hinted fonts, and high-res pictures of course look nicer on them. Doesn't matter for console livin' tho, as console fonts are bitmapped, even with a fancy framebuffer in between. They'd just increase in scale, basically, which you'd have to do on a 4k anyway, as the usual console fonts are fucking tiny at 4k.

The only way he'd be fucked is when his screen does bicubic interpolation instead of nearest neighbor for scaling. That would look like shit, yes.
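The arithmetic, for anyone who wants to check the "literally 4x" claim:

```shell
# 4K (3840x2160) has exactly four times the pixels of 1080p (1920x1080),
# and the scale factor is an integer 2 in each dimension, so nearest-neighbor
# upscaling maps every 1080p pixel onto a clean 2x2 block
echo $((3840 * 2160)) $((4 * 1920 * 1080))   # 8294400 8294400
echo "$((3840 / 1920))x$((2160 / 1080))"     # 2x2
```

The integer scale factor is exactly why nearest neighbor is lossless here, and why non-integer scales (e.g. 1440p on a 4k panel) can't be.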


I'm curious, what kind of Allwinner do you have? From my experience they all run like garbage with browsers in Linux. Are linux browsers still so broken that they don't use any hardware acceleration? Man that's sad.



t. somebody who hasn't tried it

>what is bicubic interpolation



It's just an A20 (dual core 32-bit Cortex-A7 @ 1 GHz). There's no special hardware acceleration in browser, and anyway I don't want to play games/watch video in browser. All I need it for is online banking and ordering things, and stuff like that. Otherwise I use simple browsers like Lynx, Links2, w3m. For playing videos, I use mpv, it works fine even in the framebuffer console (just like pretty much any SDL program).


File: 9ca082af8f01403⋯.jpg (84.17 KB, 960x720, 4:3, Scaling Images Unlike with….jpg)


Bicubic is a bit more complex but same difference, it will look like shit with bitmapped fonts. You want nearest neighbor.


intrigued, do you do any serious work on it, too?



The dumb terminal (Sandy Bridge Xeon with 8GB of RAM) can barely drive the two 4k monitors I work on at work, when all I do is SSH to a Haswell i7 to do all of my compilation. Why the fuck would I ever use shit minimal hardware when my job revolves around compiling software that takes 30 minutes to compile on relatively modern hardware like a 4790k?



No, I'm just a NEET now. But even when I did serious work, it wasn't anything that needed more than a text interface as I mentioned in >>1018665 >>1018780 >>1018802

And compiling stuff on the A20 is of course not as fast as on amd64, but I'm a madman who built a NetBSD kernel on a class 2 microSD card I had lying around (I didn't want to nuke my Devuan install, so used this spare card instead for playing with NetBSD). I'm now getting a power supply strong enough to attach a SATA disk to my board (it does have a SATA port). That will be much nicer.



100k but yea this.



>complaining about widescreen being "shorter"

>when the most common monitor size for 4:3 never went above 17" (~9.5" vertical minus bezel)

>and the most common monitor size now is 27" (13.4" vertical)


Widescreen is retarded marketing crap and we are poorer for being stuck with it. Yeah you can put it in portrait mode, but with an even somewhat large monitor that's fucking useless and you know it.

Bigger is better. Always sells in marketing. Doesn't really mean shit. I think 20-21" 4:3 is the human comfort zone for a monitor, with 16:9 you need 27" if you sit right in front of the monitor, else you either have to put the monitor at a weird distance or sit in an unhealthy way. Stuff in the corner of the monitor will still be at an angle and backlight bleeding (which can't be avoided fully) will be more noticeable. It's an unsolvable problem with the way human eyes are placed in the head and 16:9. Go ahead, test it, I'll be waiting.

Now test the same with a good 4:3 21" screen, 1600x1200 preferably (higher resolution is always better because of better pixel density, although there are diminishing returns with screen size). Everything will be straight ahead, you can sit comfortably close to the monitor, and backlight bleeding on dark backgrounds will not be noticeable. Just works. Nobody needs that 16:9 shit; nothing works well with it. You either clutter it with windows, which is super distracting, or you live with huge empty spaces.
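The napkin math for the vertical sizes, if anyone wants to redo it: panel height is diagonal * h / sqrt(w^2 + h^2) for a w:h aspect. A sketch (results land close to the figures quoted above; bezel/overscan not included):

```shell
# vertical height of a 27" 16:9 panel vs a 17" 4:3 panel
awk 'BEGIN {printf "%.1f\n", 27 * 9 / sqrt(16^2 + 9^2)}'   # ~13.2" (27" 16:9)
awk 'BEGIN {printf "%.1f\n", 17 * 3 / sqrt(4^2 + 3^2)}'    # 10.2" (17" 4:3)
```

So the jump from 17" 4:3 to 27" 16:9 only buys about three inches of vertical.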

Not that I disregard all new technologies - modern panels with LED backlights and whatnot are objectively superior, it's just the form factor that's retarded. Nobody will ever make a 4:3 screen for a commercially feasible price again, because it would be smaller than competing 16:9 and you can't sell that to normies.



Why not the Pi 2? It has the same CPU, just four of them.


Nah I go way over the top, most of the shit I do outside of gaming is run on my server


If you want real minimalism, the best way is to set your limits and just work from within them. One thing I did was set a budget under $1000 and calculate how much RAM I would really need. I don't do gaming outside of some MAME/FBA emulation, so I didn't need anything more than 8GB. If you're talking about software, well, there's a lot of options; suckless stuff works best for me



that's the point you fucking retard, you can't change how your monitor scales resolutions, it's hard-coded into the firmware. the driver doesn't offer an option for nearest neighbor either, but that would require OP to pump 4k to the monitor anyway



>Why the fuck would I ever use shit minimal hardware

because of mental illness. that's pretty much the only reason


Atom notebooks, $200 and you're good.



I wanted a board that can run without firmware or driver blobs. The RPi unfortunately needs one to boot. Allwinner also has some quad cores and even octa cores, but the support there isn't as good as A20 yet. The octa cores also basically need a fan for cooling, so meh.




I used Gentoo on a 2010 Atom netbook with 2GB of RAM for a year. My main problem was playing 1080p mkvs. I compiled Linux and Firefox no problem, just start it before bed. IceWM for the wm. Firefox ran great with 20+ tabs as long as I used NoScript to only run the scripts I needed. about:memory always fixed it if it did lag up. Great UMPC setup if there ever was one. 1024x600 does get old tho.


I'd say just get a 4:3 monitor with a good resolution but I checked eBay, and they cost as much as used 27" 4k monitors. I guess there really is a demand. (for monitors at non-retard sizes/shapes)


there used to be expansion cards for notebooks with a Broadcom chip that did video decoding, and they have Linux driver support. Don't ask me what the current state is though. It also would've meant dropping the Wifi card.


I read somewhere you can compile the blob for video yourself now, but I don't know the details. Might not be true. 1 GB, 100 Mbit ethernet and no SATA etc. is quite limiting anyway.


You can have lots of fun directly interacting with the framebuffer too. The framebuffer is a file, like everything in linux, a few examples:

For shits and giggles, switch to a virtual console and try this:

(fb0 is the usual framebuffer device, might be different on your system if you have several graphics cards or something)

cat /dev/urandom > /dev/fb0

Should fill your framebuffer with colored snow.

cat /dev/zero > /dev/fb0

clears it

cat /dev/fb0 > /tmp/screenshot

gives you a raw screenshot of the framebuffer and

cat /tmp/screenshot > /dev/fb0

puts it back.

ffmpeg -i [path to some picture format ffmpeg understands, or video, or gif] -pix_fmt bgra -f fbdev /dev/fb0 -loglevel quiet

(loglevel quiet because ffmpeg is quite chatty and will overwrite what it's putting out)

will put a picture/video on your framebuffer. The video will be "played back" as fast as ffmpeg can transcode it, so mplayer is actually more useful for watching videos.

Framebuffer format on modern hardware is usually 32 bit RGBA, but there are kernel interfaces for querying it. With this it's trivial to write a small program that puts a small widget like a clock on it, for example. Or write your own screensaver in bash. Be aware though that it's not hardware accelerated and quite costly, pushing a lot of pixels will be slow. Ideally you want to write to an offscreen buffer and just push changed screen regions. This shit used to be trivial knowledge. Not anymore, that's why I mention it.
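As a sketch of how little code a solid fill takes - the 800x600 resolution and 4-byte pixel format here are assumptions, check /sys/class/graphics/fb0/virtual_size and bits_per_pixel for your real values:

```shell
# generate one 800x600, 32bpp frame of solid mid-grey (every byte 0x80);
# written to a regular file here - on a real console you'd target /dev/fb0
W=800; H=600
head -c $((W * H * 4)) /dev/zero | tr '\0' '\200' > /tmp/fb.raw
wc -c < /tmp/fb.raw            # 1920000 bytes, exactly one frame
# cat /tmp/fb.raw > /dev/fb0   # push it to the screen
```

A solid color only works with identical bytes per channel like this; for anything fancier you generate the pixel stream with a real program and pipe it in the same way.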

As we can see, as the linux console is actually supportive of truecolor you can easily change the 16 colors you get, as simply as

echo -en "\e]P{number of color, hexadecimal 0-F}{hex code of color, RRGGBB, no #}"

set your favorite ricing scheme that way.

You can also change the font. Now this part is a bit arcane, but the information is out there, so I won't describe it fully. Look into the psftools. A lot of distros actually come with a lot of fonts for the console already; you can usually find them in /usr/share/consolefonts/. You can also download fonts from int10h.org and easily convert their .fon files into psf fonts with psftools. They're very readable and were made to be stared at for a long time; some are quite comfy and easy on the eyes. Setting a font is as easy as setfont {path to .psf}; showconsolefont prints its characters.

For completeness' sake: the proper way to interact with this shit low-level now is the DRM interface, but it's a lot more complicated and poorly documented, like everything new in linux.


File: ed3a12d5144a5eb⋯.jpg (308.95 KB, 800x948, 200:237, 171343-devil-s-crush-turbo….jpg)


I checked into the RPi blob thing when I was researching boards, and stumbled on an RE project that was making a replacement for the VideoCore blob. But they never finished it, and it's missing important stuff like the framebuffer and USB. Maybe the board can boot and be used for very limited purposes (say, a headless server), but that's not enough for me. Actually they mention it here:


> Boards based on the Broadcom VideoCore 4 family, such as the Raspberry Pi, require nonfree software to startup, although signature checks are not enforced. A free proof-of-concept replacement firmware has been developed, but it is not in a usable state, and development has halted. Until the nonfree startup program is fully freed, these boards are useless in the free world.

Anyway, yeah the RPi is kinda gimped in those other areas. My board has 2 GB RAM (at a higher clock rate than the RPi), GigE, and a real SATA port. Even the power jack is a real one that can accommodate more current than those micro USB things (good enough to reliably power a laptop HDD).

And I actually have a 4:3 monitor (15 euros from a local seller), but run it at 800x600 even though it can do much more. That's how I ran my CRT in the 90's, and my Thinkpads later on, so I'm used to it. Plus fewer pixels to push means it's faster. I just don't care at all about high resolution stuff, and that's why this board does everything I need.



Interesting, thanks. Can you explain how to control what mode the consoles will operate in (pure text/frame buffer/KMS), how to select the specific frame buffer driver, and how to set specific parameters (resolution, bit depth, refresh rate, etc.)?



Depends on hardware and kernel. Some hardware doesn't have a text mode, that's more of an x86 thing. There are tools like fbset to change resolution, but those don't work on my ARM board. Instead I had to do it from the u-boot prompt:

setenv video-mode=sunxi:800x600-24@60,monitor=vga,hpd=1,edid=0

In theory, you should be able to set it from the bootargs variable that's passed to the kernel when it reads /boot/boot.scr, but that didn't do it for me. Had to outright set it in the u-boot environment itself.

This is what I get when booting:

[ 0.000000] Kernel command line: disp.screen0_output_mode=EDID:800x600p60 disp.screen1_output_mode=EDID:800x600p60 logo.nologo console=ttyS0,115200 console=tty0 consoleblank=0 root=/dev/mmcblk0p2 rootwait panic=10
[ 0.050398] simple-framebuffer bfe2b000.framebuffer: framebuffer at 0xbfe2b000, 0x1d4c00 bytes, mapped to 0xf0900000
[ 0.050452] simple-framebuffer bfe2b000.framebuffer: format=x8r8g8b8, mode=800x600x32, linelength=3200
[ 0.059456] Console: switching to colour frame buffer device 100x37
[ 0.067865] simple-framebuffer bfe2b000.framebuffer: fb0: simplefb registered!

Then only /dev/fb0 exists as the video device. No /dev/drm or whathever KMS uses.



How is a tiling WM better than framebuffer console with screen/tmux?



Thanks. What about the

kernel parameters? Do I remember correctly that
will set the kernel to use a pure text mode console (if supported), while specifying modes with any of those two parameters will enable framebuffer console?

How is /dev/fb0 interrelated with console devices? I typed

sleep 5 && cat /dev/urandom > /dev/fb0
and switched from tty1 to tty2, and the "colored snow" was output to the currently active tty (but the error message "no space left on device" was output to tty1) - does that mean that /dev/fb0 is associated with /dev/tty0 (i.e. whatever tty is currently active), while the message went to /dev/stderr, which goes to /dev/tty, i.e. the tty where the process was invoked?



Whoops, this was meant as a reply to >>1019158.



Why no /dev/eth0 so you could put raw data directly onto the network interface?



>Should fill your framebuffer with colored snow.

Why does the effect look very bright, almost white (when squinting your eyes or looking from a distance)? Why is the chroma effect (random colors amalgamating to white) stronger than the luma effect (random colors averaging to mid-grey)?



Interesting question, no idea. Maybe a function of how your eye works/your screen works?
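One part of it can at least be checked numerically: the noise isn't actually brighter than mid-grey in the data, so whatever makes it look white-ish has to be perceptual or a panel property. A sketch:

```shell
# mean byte value of urandom data is ~127.5, i.e. mid-grey per channel;
# so the "almost white" appearance is not in the pixel values themselves
head -c 1000000 /dev/urandom | od -An -v -tu1 | \
  awk '{for (i = 1; i <= NF; i++) {s += $i; n++}} END {printf "%.0f\n", s / n}'
```

Should print 127 or 128 every time you run it.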


You are basically always in the framebuffer console with modern kernels and common options compiled in. Text mode resolutions are very limited in regard to modern screens, and as has been said, it's an x86 thing; some platforms don't even have it (the classic 68k Macs are an example - remember, a lot of this stuff has legacy reasoning).

All the virtual consoles share the same framebuffer; it just gets cleared and redrawn on switch. The framebuffer is an abstraction layer that lets you interact with the hardware directly - otherwise you'd be talking to the kernel. No idea how that modesetting/DRM stuff works (apparently better), that was a bit after my time.


Short answer - legacy reasons. That Unix-hating guy that posts here should have some words on that, if he's not only larping.


It also has to be said that using the framebuffer doesn't preclude you from also having X running. You can usually do both at the same time, and with KMS, switching between X and ttys shouldn't have any real delay.

My reasoning isn't really computational bloat either, as X runs fine even on machines 20 years old and isn't too heavy for them - just don't add current-day GNOME or the like on top. No, the point of living a text-only existence is simplicity, reducing psychological bloat if you will. A set way the screen is put together, a set number of colors, a set font everything is written in. It sounds so simple that it's stupid, but it actually really helps!

Also cutting out the internet, social media, and the following of news. Having no computer capable of doing that stuff easily can be very helpful. A few weeks after starting to live that way I started drawing. Yeah, drawing. I had never drawn before in my life; I just felt creative like that. I've also started writing the (curses-utilizing) game I've always dreamt about doing but somehow never found time for - time I apparently wasted browsing shitty websites and looking at stuff passively. In regards to social media, I only come here anymore, and only on select days.



To add to the drawing thing: Before I started living this way I would have probably started browsing for drawing tutorials on youtube/the wider internet, would have seen people that are 100000x better than I ever will be and probably would've never even bothered starting.



What are the (usual, at least) interrelations between the following: /dev/tty[1-6], /dev/tty, /dev/tty0, /dev/console, and /dev/stdin, /dev/stdout, /dev/stderr?
It seems that:

>/dev/tty[1-6] are the six separate virtual terminals which are usually available

>/dev/tty means the terminal where a given process was started from

>/dev/tty0 means the currently active (displayed) terminal

>/dev/console is a character device where the system prints diagnostic messages to, normally it is connected to /dev/tty0 (i.e. the currently active terminal), can be changed with "console=" kernel parameter?

>/dev/stdin, /dev/stdout, /dev/stderr are all linked (via /proc) to the currently active terminal, unless redirected elsewhere with something like 1>, 2> etc.

Please correct any mistakes (if any) in the above.

Regarding /dev/fb0, I ran the following:

dd if=/dev/urandom of=/dev/fb0 bs=1
which made the output slower (every byte a separate I/O operation?), and I switched ttys while it was drawing lines of random pixels from top to bottom - upon switching the tty, it continued to draw on the currently active tty, but from the line where it left off in the previous one. What does that mean?
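The slow part at least is easy to pin down: bs=1 makes dd issue one read and one write syscall per byte. You can see the overhead without touching the framebuffer at all - dd's own transfer stats show the collapse in throughput:

```shell
# 1 MB copied one byte at a time = roughly a million read/write syscall
# pairs, versus a single pair for one 1 MB block; compare the MB/s lines
dd if=/dev/zero of=/dev/null bs=1 count=1048576
dd if=/dev/zero of=/dev/null bs=1048576 count=1
```

Same data moved, orders of magnitude apart, purely from per-call overhead.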



Btw, can "ls -l" be made to traverse symlinks all the way to the end? For instance, "/dev/stdout" is a symlink to "/proc/self/fd/1", but that is again a symlink to "/dev/tty[n]" (where n is the number of the currently active virtual terminal) - is there a way to an ls on /dev/stdout so that it shows the actual file at the end of the symlink chain?



no i use windows 7 like a man



Not sure if there's a built-in way, but if not, you could easily wrap it with or pipe it into a script that re-resolves links before printing. They can be identified in a number of ways, with -F probably being your best bet.


How do I connect a virtual console (such as /dev/tty1) to a serial console (such as /dev/ttyS1) to get a console connection to a device connected to a serial port?




>Cumskin really believe that

Ahahahahahahaha, chinks are superior



Not sure about posix but GNU ls has -L for dereferencing symlinks, which you would know if you read the fucking manual.


ttyN are the virtual consoles. There are no actual teletypes anymore, so they're all virtual. You will normally be dropped onto tty1, from which you log in and do whatever it is you do. Display managers might also start on tty1 by default. devtmpfs seems to create 64 of them for good measure.


You don't connect a tty to a serial tty. You start something like agetty on ttySwhatever and plug your vt100 into the corresponding serial port on the machine, just like you would run agetty on tty3 to get a virtual terminal there. Have a look in your /etc/inittab.

If you just want to talk to a device on the serial port: cu, or screen, or whatever the hell you please. As much as UNIX pretends to be everything-is-a-file land, it's all a lie. Everything is special and requires special programs.
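For the inittab route, the line looks something like this; the ID, runlevels, baud rate and terminal type are placeholders, check agetty(8) for your setup:

```
# /etc/inittab (sysvinit): respawn a login getty on the second serial port
S1:2345:respawn:/sbin/agetty -L 115200 ttyS1 vt100
```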



>plug your vt100 into the corresponding serial port on the machine

What I meant was the other way around, i.e. not using a hardware terminal to connect to a system running on your computer, but rather to use your computer as a terminal connected to another device (like a router or switch etc.) which has a serial/console port.



I use cu on my OpenBSD laptop to connect to an ARM board. It's part of the base system (/usr/bin/cu) so it makes sense to use it. There's also a package for Linux, or else you can install Minicom. Basically I use it like this:

cu -l cuaU0 -s 115200

But the device name will be different on Linux. Anyway, it uses ~ as the escape character: when you type it as the first character on a line, the next character you enter is taken as a command. For example, ~. drops the connection. The man page has all the possible commands.

If you want something more fancy, try Minicom. Allegedly screen (GNU Screen I guess) can do it as well, but I didn't try this.



>GNU ls has -L for dereferencing symlinks

It doesn't say what the target file is, though; it just pretends the symlink was the target file (i.e. for /dev/stdout it will just say it's a character device, still obscuring the fact that it's actually a symlink to /proc/self/fd/1, which is in turn a symlink to /dev/tty[n], the actual character device). The idea was to have ls reveal what the file at the end of a symlink chain is; -L doesn't seem to do it.
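If you have coreutils, readlink -f (or realpath) walks the whole chain for you, and util-linux's namei shows every hop; a throwaway sketch:

```shell
# build a two-hop symlink chain in a scratch directory
d=$(mktemp -d)
touch "$d/endpoint"
ln -s "$d/endpoint" "$d/hop1"
ln -s "$d/hop1"     "$d/hop2"

readlink -f "$d/hop2"   # prints $d/endpoint, the real end of the chain
# namei "$d/hop2"       # (util-linux) lists each symlink hop in turn
```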


File: 11979058da5e54a⋯.jpg (26.62 KB, 600x600, 1:1, PL2303HX to USB TTL.jpg)


Forgot to mention, cuaU0 is the local USB serial device. I got one of these plugged into my laptop, and the leads go into the ARM board. Careful with this: you probably don't want to connect the red wire unless the device actually requires it. Details here:


Oh, and TX line must be connected to RX lead on board, and vice-versa.



Nobody told me anything you mongoloid. I don't want to scroll almost twice as much for working on spreadsheets, code, documents, or webpages. It's like looking through a slitted window.



nice LARP


>implying SystemD isn't the ultimate cucking

At least win10 is upfront about what you get



Ricing is important to me. Your environment has legitimate effects on your state of mind. Windows ricing is a sad hack.

Windows' only benefits apply to people who are apathetic towards their desktop experience, or want software not working on other systems. Those are good reasons, but neither apply to me.



No they aren't. Windows 10 did that shit when instructed not to by user settings.






Shit bait


File: 2bd36fc7b7d6f30⋯.jpg (88.35 KB, 540x675, 4:5, Card_crusher_psyco.jpg)


10 year old OC posten



Right now? No, but I am looking to do so. Some of the Pine64 and ODROID boards have "good enough" specifications (4GB RAM, multiple CPU cores, etc.) that they could be daily drivers for everything minus GPU-intensive tasks (and 144Hz :( ).

One problem is that many of the most performant ARM-based SoCs also suffer from speculative execution exploits (I realize there are mitigations, but it would be ideal to spend money on hardware that is more secure).

Another problem, which you mentioned, is that many either require specially-configured kernels or screwing around with a bootloader to get an unsupported distro working on them; however, this isn't as difficult or deal-breaking as it may seem on the surface.

I've been contemplating this idea (utilizing more minimal hardware in terms of power costs) for a while but still no closer to a definitive decision. Any anons have some suggestions?


you need some special kind of autism for that. systems that are accepted as minimal here are annoying and hard to use for most people.



why would you do that? usually those have a really low res screen, so that's just a waste of resources. just get them in 768p instead.



Most people are idiots. There's also a special kind of psychological value in such a system, I think. I know I got more creative and productive every time I limited my environment somewhat. Less sometimes really is more.


File: d744c03c3ff153b⋯.jpg (103.65 KB, 800x533, 800:533, xterm.jpg)

What I would love to do is have noiseless, calm, X Terminals everywhere, and have my Workstation somewhere like a closet. I've got the thin clients that could manage it.

What's holding me back is having to give up video playback. I've tried it, but a 100GB LAN w/ client video codec support can't in any way manage 1080p, the setup is a PITA, and you can forget about audio/video staying in sync.



I tried that in-house Steam streaming thing over the network to a Linux box in 1080p and it was good enough to play GTA5, but I had to limit the framerate to 30. Cutting the framerate in half helped the weaker client system a lot. It was playable. If you use in-hardware video encoding and decoding it should work, but yeah, it's generally a rough experience. It's one of those things you'd expect technology to do just fine in 2019, yet it doesn't.


You go with a weaker CPU or you live with the possibility of these exploits, can't really have both, else these exploits wouldn't be the huge problem they are.



Sorry if this sounds dumb, but couldn't you basically just run extended cables for the keyboard and monitor to the workstation, rather than having a thin client over LAN?



Sounds interesting, anything more on that? Were you connecting via serial to another machine (given that kermit seems to be used for that)? If not, then how would the connections work (both physical and software)?


File: 70dedd2b38f4032⋯.png (896.6 KB, 1280x717, 1280:717, aho.png)

I'm running the [bloated] KDE Plasma at under 7W power consumption.

I removed and disabled this laptop's display since it's already broken (all but one vertical line dead). I still have the DVD drive connected right now, so it could go even lower if I remove that too. My external display is Energy Star rated, draws ~18W in normal usage, and goes under 1W in standby mode.



>rPIjew rant

>ARM rant

>not using quadcore intel atom like acer d270 bios-unlocked to work with 4GiB RAM

>not using a 19v powerbank @ 2019

>parts like batteries, keyboard, mobo, display are still sold online. entire netbook is probably much cheaper than the rPIjew 480p HDMI display module.

>or not getting open source nexus or nextbook device and putting arm debian/arch or gentoo.


What's with all you fags and notebooks? I don't live in the ghetto; I have a white man's house and I like to have a desktop computer like a white man. They're much more easily expandable and maintainable than some shitty old used notebook with the previous owner's pubes probably stuck in the keyboard.


>I have a 4k monitor and sadly got fucking used to it. Should I go back to 4:3 screens /tech/? I just wanna simplify my life somewhat.

I have only 5:4 and 4:3 monitors mostly 1600x1200 or 1280x1024 and sadly got fucking used to it. My CRTs have graphics that actually look good and non-changing from any angle, have zero input latency, and good contrast and sadly got fucking used to it.


>The framebuffer is a file, like everything in linux

And just like a file it has no vsync, so you can't even do animations properly (the Linux "framebuffer" API actually has a vsync signal, but it's hardly ever implemented). DRM-KMS is much better.

>Ideally you want to write to an offscreen buffer and just push changed screen regions.

Yeah, and you want to do it during vblank, which almost never works with Linux's "framebuffer" API (in nearly all framebuffer drivers the vsync call is just a no-op), so you gotta use DRM-KMS, which actually lets you do basic stuff like setting real timings anyway.



CRTs are horrid. Also you apparently never heard of IPS panels or any other improvements on them that came afterwards. That angle stuff hasn't been a problem for a long time now. I can put my head against the corner of my LCD and look from there sideways to the other corner and all colors still look correct. Get with the times dude.




also it's a very common misconception that CRTs have zero input latency. If you don't believe it, google it; there's tons of information about that out there I don't feel like repeating here. It also makes no sense for any display technology to have "zero input latency", that's just not how physics works. Input latency is a meme anyway: for most things it lies way below what a human can even perceive. Our eyes and brain are not zero latency either; in fact our brain isn't even that great at processing visual information and relies on tons of shortcuts, stitching snippets of information together to process what it sees. That's how visual illusions even work. Our eyes are not high-speed, 100% objective cameras that process and store every frame with perfection.

Most CRTs are in fact not significantly better than modern LCD screens geared towards speed.

There's really no reason to stick to CRTs these days except nostalgia; a modern LCD is better in every way.


>he fell for the 4k meme

>at same time demands minimalism


fuck off jew nigger

LCDs are trash, even premium ones from 2019

they have shit viewing angles, shit colors (many LCDs have only 6 bits per channel), huge input lag, slow response times, and shit aspect ratios

CRTs still destroy them. and you are wrong about input lag nigger, CRTs have less than one millisecond of input lag, LCDs have 15-50ms. it does matter.



Shit viewing angles and contrast are literally not true anymore. Stop being poor and afford yourself a nice screen from this century, kiddo.

Also yeah, you're totally gonna notice even 50 ms of input lag, idiot. There are actual scientific studies: even fit, young athletes have reaction times above 200 ms, and for a fat /tech/ lardass who sits all day on his ass and only shoves a little mouse around, it's going to be a lot higher.

Just get the information that's out there and actually start living in present day. Or not and huddle in the dark in front of your old-ass CRT. See if I care.



Exactly anon. Eyes can't even see at 50ms.


The best way to prove to these "muh input lag and frames" types that the brain simply doesn't work that way is to tell them to recall something simple and then tell them to draw it.

Take your favorite console font, for example. Walk away from the computer and try to draw three letters of that font. No real drawing skill necessary. Now try ten. Now try the entire alphabet. You've looked at that font for years at this point, yet you can't do it, because your brain simply doesn't store information that way. Visual input is not a constant stream of frames that your brain processes like some kind of computer from the top down, but a very complicated tapestry of parts of your visual field, memory, and your brain expecting things to be where they are without really processing the input from your eyes at all. That's why visual illusions work. That's why you can show a short movie of some event, ask three people what happened in it, and get three different answers. Claiming you can objectively notice a difference between 5 ms of input lag and 50 ms is straight up laughable. But nice work falling for those particular marketing tricks. I'm surprised nobody has claimed yet that the jews did LCDs. That would complete this particular 8ch bingo.



You could also ask someone to play/sing all the parts of their favorite piece of music.


File: a179895b35c5d3c⋯.jpg (37.98 KB, 433x433, 1:1, 1530049294073.jpg)


>Walk away from the computer and try to draw three letters of that font.

tfw I can visualise it pretty well and could probably do the whole font.





>human eyes can't see above 16 FPS anyways lol




t. someone who's never compared mouse movement at different refresh rates or with X11 compositing enabled vs disabled. Try using a mouse at 30Hz, then come back and tell me it feels exactly the same as 60Hz or ideally 75Hz. There are still gains beyond the 75Hz mark, but past that point they shrink dramatically.


>200W of power consumption when I open a text file.

Quit using a Mac anon.


I just use old hardware. It's better, because everything modern is absolute garbage made for consumers who don't understand technology and are used to throwing it away every 2 years to "upgrade." This forces my selection of modern software to remain minimal and educated, so no electron shit.



Hear hear!



>I just use old hardware

How old specifically?



>Shit viewing angles and contrast is literally not true anymore

Since OLED for monitors doesn't count, you either buy VA with black crush and poor color accuracy, or IPS with shit contrast and the IPS "glow" (too bad A-TW polarizers aren't used anymore).


File: 8fd0d8406dc822b⋯.png (164.5 KB, 1500x2166, 250:361, this anon is either gay or….png)


>angry for no reason

U okay?

>probably baiting but can't help and think he's really just being serious

<old and used notebook where there are probably pubes from the previous owner stuck in the keyboard

You even know it so you're one to talk.

I'd bet my 2 cents that you're using discord.


>crt can't even render horizontal or vertical line properly

>eyelet can't notice the interlacing flicker

>earlet can't hear the buzz

>sms or calls from afar or any modem signals distorts the display signals


>common misconception that CRTs have zero input latency.

This guy knows it. It's just those jews on eBay trying to sell tards the shit they got from a dumpster dive or from the wrong side of the street. tards probably think it makes them better at CSGO, from what I heard.

The length of a CRT's cable shits the signal up, so there's huge latency already.

Meanwhile LVDS LCDs have shorter wires connected right next to the graphics chip and a higher refresh rate; you can even view the specs using HWiNFO and see that it's just 1-3ms of output latency. Plus the keyboard input is around 1ms, since it's directly connected to the main board, not over a 100Hz USB bus with an 18ms delay - add more for every bus occupied by another USB device, for wire length, for SNR-dropped frames from interference, plus shitty Windows or Linux kernel defaults for UAS traffic.

How can nostalgiafags even compete?

Laptops are perfect for programming and gaming. Also, you wouldn't want the HDD heads retracting from a power outage, saying goodbye to your files or game progress, and getting a bad sector as a bonus, since desktop HDDs are designed as consumerist trash, unlike laptops, which are designed so the consumer won't hate the manufacturer, so they put rugged hard drives or SSDs in there.

>inb4 SSD on PC

Much worse. You can't even fix a dead SSD and why use a low power device over that?



>5ms lag or 50ms

Have you tried queueing an RTS or MOBA game on a server on the other side of the globe?

100ms latency is already noticeable. 300ms is unplayable.

Now try playing with bots and get 0ms latency when playing a bot game.

Or better, try programming in VS products. Just try; I'm not telling you to use all that. Or you could even program a notepad that simulates input latency and actually prove and know what you're talking about, instead of just making conjectures and drawing conclusions without any real evidence aside from word play.



>he doesn't know the old anon post about that experiment with the airforce (?) where they tried to slip an image of plane for a millisecond on a video (?) and the guys saw it and even could tell the model of that plane.

Someone dig it up for me.



>yet you can't do it

You appear to have mistaken us for niggers.



You could, and it would work perfectly fine, but you would need to put cables in the walls, which is expensive if the house is already built; also, then they wouldn't be able to sell something to you.



Literally just a fucking laptop, like 150W tops.



However, if you have old-school murican double walls then just do that. It's as cheap as the cables, and you only need one USB cable because you can simply use a USB splitter at the end.

I'm also not sure over how many meters HDMI will work fine, but the link says it will work for 15m:



>i use Lynx and text browser, and only command line software

>if I can't see the bloat, it is not there

Bloat is bloat, it doesn't matter where it is. When you abandon GUIs and replace a single-click action with a console command containing 30 arguments, the bloat is in your brain.

The additional energy required to learn it, think it and type it, and the cost of providing that energy, is much more expensive than the few processor cycles you save. Not including the time costs.



>Also yeah, you're totally gonna notice even 50 ms of input lag, idiot. There are actually scientific studies and even fit, young athletes have reaction times of > 200 ms, for a fat /tech/ lardass who sits all day on his ass and only shoves a little mouse around, it's going to be a lot higher.

you fucking dumb nigger, input lag has nothing to do with reaction time. you are comparing two totally unrelated concepts

you can easily notice input lag of 50ms



LCDs are trash, worse in almost every aspect than CRTs. jews jewed you



Do we have 150W laptop PSUs?



>22'' CRT monitor

>only ~19.5'' actually usable/visible

>weighs almost 70 pounds, smaller desks need not apply

>takes a fuckton of space, again smaller desks need not apply

>theoretically goes up to 2048x1536@85Hz, but good luck reading any text in that resolution

>realistically only 1600x1200 usable

>even that is not that sharp anymore due to wear and tear it sustained over the almost two decades since it was manufactured

>colours off and tinted, trying to adjust it via the normal menu doesn't help much at all

>one of the colours became laggy and leaves a smeary trail on screen, making it useless for gaming or even movies with lots of fast moving scenes

>seemingly no way to remedy the above except maybe (just maybe, no guarantee of success) going into a hidden advanced service menu where everything is obscure (partially just hex codes) and any mistake might actually further break or even brick the unit

>uses up to 180W of power

>because of the above it emits huge amounts of heat, doubling as a literal space heater in the winter, but not much use during a hot summer

oh whew



oh and

>one of the VGA interfaces is semi-broken too, sending no EDID data and needing you to fuck around manually with xrandr and/or modelines in xorg.conf to even get the display mode (resolution/refresh rate) you want to work


I use the Pi 3B+ as a desktop and it's fine. Things take a minute to load up once in a while, Firefox takes like 30 seconds to launch, but it's not bad.

Most of you faggots never even used an old Unix workstation in the late 80s or early 90s, those fucking things were a hundred times slower than a Pi in every way. Heck this thing has 4 CPU cores, in my college days the dual core SGIs were like a half million dollars when it was all said and done and they were like a thousandth as powerful even including the GPUs.

<ree the pi is slow

This kind of faggot attitude should catch a ban.



*dual cpu



I used one as my main server for a bit, but I found it really unreliable, especially under load. For example, if I was encoding a song while streaming a movie from it, it would occasionally just lock up. Not crash or reboot, just freeze, which is pretty much the worst case scenario, especially if you don't have immediate physical access to it. I ended up just putting Gentoo on an old laptop and ditching the Pi.



Mine seems to work fine. Each revision of the full featured board has been a major speed upgrade though so compared to the earlier ones it's much better. They all benefit from heat sinks or a fan though.



>The only way he'd be fucked is when his screen does bicubic interpolation instead of nearest neighbor for scaling. That would look like shit, yes.

Pretty much 90% of screens do bilinear scaling. If you want something better, it has to be done via the GPU driver.



>thinks taller/shorter means inches and not pixels

>thinks 27" 1920x1080 is taller than 21" 1600x1200



>27" 1920x1080

That's the mistake. 1080p is good for 24, with 27 you should aim for 1440p.



>two single apostrophes instead of one double



>a quotation mark is the same as two apostrophes



What about 1440p support under Linux and BSD? Is it supported in all distros from the past decade or so without hassle, or should I generally prepare for issues, hope that my specific combination of monitor, port type and version, cable type and quality, graphics card, driver, xorg version etc. etc. will work, and pray?



See that's what I mean. Just look at all you fucking shitheads! Muh HD, muh 144, muh widescreeeens are like eyes! 1440peepeepoopoo



>dae u fukin shiteds poo poo

This is how you ensure your points never get across, how you alienate everyone around you and how you end up as a sad, lonely mess.



What's wrong with 1440p though? It gives you much more screen real estate and seems well-suited for 27" screens (much better than 1080p anyway). The only question is whether it's as well supported as 1080p, or whether I should expect jumping through weird hoops to get it working; that was the question.

>muh 144

Don't care, 60-75Hz oughtta unironically be enough for anybody

>muh widescreens

Good luck finding a modern (i.e. manufactured in this decade) display which is not widescreen



you just use cvt to write the fucking modeline for you, you spastic



>you can automatically create modelines without any information whatsoever from the monitor what modes it supports

That's why it's stuck in 640x480@60Hz without EDID information?


If your monitor gives proper EDID then none of that is an issue. 1440p is a terrible resolution, because fonts (or rather, the software rendering those fonts) don't scale well to it. 4k worked better for me.

Widescreen is fine and actually works pretty well if you can part the monitor in the middle. This might or might not fit into your workflow, some people find it distracting. Two terminal windows side by side are often helpful though. With widescreen of good resolution, you do not need a second monitor.



>If your monitor gives proper EDID then none of that is an issue.

Issue was that one of the VGA outputs doesn't send EDID for whatever reason.

>1440p is a terrible resolution, because fonts (or rather, the software rendering those fonts) don't scale well to it.

Why would you want any scaling? Goal is to use native 2560x1440 with a larger screen area than 1920x1080, not to scale everything to end up with the same effective screen real estate.

>if you can part the monitor in the middle

What do you mean by this?

>Two terminal windows side by side are often helpful though.

Many windows tiled on the screen is very good to have, hence the need for a larger screen area (you can display many terminals, an IDE, a file manager, ebooks or other digital documents, a video player etc. all at once).


File: 23c1d01e6c91682⋯.png (709.05 KB, 1920x1080, 16:9, s1934.png)


EIZO still makes quality 5:4 monitors




What's the resolution? Please don't tell me it's 1280x1024.




>$1000 monitor

Guess I'll stay on widescreen.



Eizo has a 26.5" 1920x1920, a genuine 1:1 square monitor, which honestly sounds interesting because you'd be done with excessive scrolling and would also avoid all the whitespace you often end up with on 16:9. Have seen it go on eBay for ~600 bucks sometimes. Their 5:4 and 4:3 screens are the usual resolutions and sizes, just with updated technology, like IPS panels, LED backlights and DisplayPort connectors so you can daisy-chain several.

Dell has a cheap 5:4 1280x1024 LED Backlight&DisplayPort Monitor (~120 bucks)



Because of the LED backlight it's also super energy efficient, around 10W of usage. Just be aware that it only comes with VGA and DisplayPort. They're meant as a sort of refresh/replacement for older POS screens and aren't advertised much. I know about them because I had to set up such a POS system and was surprised to hear they're even made/sold in current year.



IPS is shit



The difference is that the software was optimized for those machines, which in their prime were THE most powerful. Besides, you have to take the enterprise and professional features into account, like infinite uptime and stability. A Pi is not only slow compared to contemporary computers, but also excruciatingly uninteresting and boring.


I've got a 24" 4k monitor, and at the TTYs I get 120x65 cols/lines *per* screen half (split with GNU screen) with a nice and very readable font I made myself. In fact, everything is nicely readable, as the screen has 183 PPI; hell, you could even switch to one of the inbuilt fonts (which are tiny at this resolution) and it would still be perfectly readable. The screen is driven without problem @60Hz by an 8 year old AMD system via its integrated graphics. Since it's not some korean garbage screen, it has a nice anti-glare coating and wonderful contrast/decent factory sRGB color calibration (with the option to calibrate it in hardware, an option I never used since I don't care that much; I don't do graphics or photography).

If any of you autists had any idea what you're talking about, you'd realize what garbage your 20 year old screens are at this point, but you don't, so you haven't. But then you are autists, so changes (or sudden movements) scare you. Enjoy larping with ancient crap.



Not yet, but I have been playing with a CLI setup to see how far I can go (mostly because it is light and makes the transition to weaker processors easier) to cover office tasks. I am aiming for an alternative to Windows with Microsoft Office or LibreOffice installed, without dependency on a GUI. The obvious answer to Office is LaTeX (with beamer or other packages), both for speed in writing mathematical equations and because it outputs files to pdf. I am also taking into account the file formats widely used in the industry; the conclusion seems to be that I can create files accepted in the industry, but I can't collaborate without relying on LibreOffice. I am not aware of collaboration happening on artwork in Photoshop formats, so those formats are discarded. So, these formats (and protocols) are:

doc docx https://tex.stackexchange.com/questions/8836/producing-doc-docx-from-latex

xls xlsx ???

ppt pptx (just use pdf for presentations instead)


Microsoft Exchange (fetchmail, smtpd, and mailx)

I have yet to find a proper alternative to spreadsheets; so far it seems to be R, since it can export data into the xls and xlsx formats.

There is more to consider, though: the trend still favors the web as a universal interface for any number of tasks, which is becoming the real stumbling block here, so it is advisable to have a Blink or Gecko browser, UNLESS the service in question provides an API giving access to its functions, in which case you could write your own "browser" specific to that service.
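For the doc/docx leg, the usual route (per that stackexchange link) is to write plain LaTeX and convert it; the pandoc invocation in the comment below is my assumption, not something from this thread:

```latex
% minimal.tex -- a formula-heavy sketch; convertible with e.g.
%   pandoc minimal.tex -o minimal.docx
\documentclass{article}
\begin{document}
Closed form: $\sum_{k=1}^{n} k = \frac{n(n+1)}{2}$.
\end{document}
```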




just get something with an atom cpu. buying some high end gaming hardware for text editing would be retarded.



>obvious answer to Office is LaTeX

fucking lol




How does this prove anything? The brain processes text not as images but as meaning, and that has nothing to do with being able to notice an event that takes less than 50ms to occur. People can notice a single odd frame in a movie sometimes, and 1/24 s is less than 50ms.



what I love about this is how they advertise hardware features that aren't even usable without non-free blobs, if even at all in recent kernels.



>this argument

"Who are you going to believe, my science anecdote or your lying eyes?"



Don't diss it until you try it. Writing math formulas there is fast, and the goal is to have a toolset that is as capable as, or more capable than, the software found on a common Windows computer. If your issue with the CLI is the plethora of different commands across distributions, then you are trying to adapt to an ever-changing set of tools and won't learn anything that way. Take the POSIX standard and stick with it, and you won't have that problem (as often, since distributions are trying to leave it behind, because the modus operandi of the FOSS movement is to copy commercial software in the hope that it will be replaced; see Unix and LISP machines).



Are you sure?



tfw when minimalists are indistinguishable from consolefags



Sorry anon but math formulas are for idiots. I get that you are some fag student who thinks they've discovered fire but lmao you are not only wasting time using Latex but you are also writing math.


I use an x41 almost every day; it's probably my favorite computer design I own. I use it for email and a few other things, running Funtoo with fvwm mainly, but it also has Windows 2000 and XP installed.

works perfectly fine, bar the 45 min battery life

I have a PowerBook G4 I'm partly transitioning to, mostly for classic Mac OS software support on Tiger, but also for the same email as the x41. No battery at all for this one, sadly.

lastly I have an x60T that I use for artwork, but Krita and GIMP are slow as fuck on it. Probably Windows' fault, but still, I would really like a way to use it as a tablet for my desktop; VNC is too slow and Steam streaming is not good for what I need.

Anyway, pretty much all of these pre-2008 computers are going to way outperform most ARM chips.

also 4k is absolutely worthless and just makes working harder in most software. I really hate trying to move a single pixel even on my HD screen

if you want real minimal, use old Mac OS (pre-10.8, preferably OS 9) or Windows XP (or older), or use Funtoo without X and use framebuffer programs

even DOS can do pretty much anything you need if you are 'minimal'



You know there's something wrong when a 10W LED backlight uses as much energy as an Energy Star CCFL



you can get a 1280x1024 monitor for $5 literally anywhere except the store that sells brand new calculator screens with shiny frames for $500-$1000



>it has a nice anti glare coating

I've yet to see one calculator screen anti-glare coating that isn't shit



That's because Eizo only sells "high end" shit. Meanwhile you happily forked over hundreds of dollars for a new widescreen which is one of:

2006 era:

-16ms input lag

2010 era:

defective garbage

2015+ era:

you can only get a decent one for $700+

Literally the first widescreen I got in 2012 was a complete piece of shit. Decent contrast because VA panel but everything else about it sucked monkey balls. Even the anti-glare layer was somehow 10x worse than my previous calculator screens that broke down.



It's also unreviewed and probably a shitty TN panel with buggy software (monitors are actually just shitty embedded computers connected to a panel, in case anyone here doesn't know that).



I've used 40 monitors on my Linux machines (I'm reviewing them) of all shapes and sizes and generations from 1990s to 2018 and never had a problem with not being able to generate the correct timings. Usually the monitor just tells you the optimal one via EDID and you use that...



CRTs have no input lag because they don't pointlessly buffer the image before outputting. Once the current pixel arrives on the wire, it's output within a few microseconds at the current raster position.
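Back-of-envelope numbers for that claim (the 1280x1024@85Hz raster totals below are illustrative, not from any particular monitor):

```python
# Rough latency comparison: CRT scanout vs. an LCD that buffers a whole frame.
# The 1280x1024@85Hz totals are illustrative approximations.

refresh_hz = 85
vtotal = 1072          # total scanlines per frame incl. blanking (approximate)

line_time_us = 1e6 / (refresh_hz * vtotal)   # time to draw one scanline
frame_time_ms = 1e3 / refresh_hz             # latency added by buffering one frame

print(f"CRT scanline time: ~{line_time_us:.1f} us")
print(f"Full-frame buffer at {refresh_hz}Hz: ~{frame_time_ms:.2f} ms")
```

So a CRT puts a received pixel on the phosphor within a scanline or so (~11 microseconds here), while buffering a whole frame costs three orders of magnitude more.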


High-quality CRTs are still better than most (or all) LCDs.


>Shit viewing angles and contrast is literally not true anymore. Stop being poor and afford yourself a nice screen from this century, kiddo.

TN - turbo shit viewing angles

VA - Shit viewing angles

IPS - decent viewing angles, but lots of money, shitty pixel response curves and overshoot; they also commonly have input lag on top of this

>Also yeah, you're totally gonna notice even 50 ms of input lag, idiot.

I """notice""" 16ms of input lag caused by double-buffered vsync. Any FPS game is on a whole different level once you turn double-buffered vsync off.


>crt can't even render horizontal or vertical line properly

Negligible compared to the issues calculator screens have.

>eyelet can't notice the interlacing flicker

Computer CRTs don't interlace. Flicker was solved three or more decades ago when refresh rates stepped up to 75Hz/85Hz and far beyond. My CRT goes up to 160Hz (at lower resolutions).

>earlet can't hear the [buzz]

Yeah, that's a problem with some of my CRTs unfortunately. Of course you have to concede that some LCDs do this shit too, and that if you're listening to music all the time like a real G, you can't hear the buzz.

Are you trolling? Wire speed is not an issue for human-detectable input latency. Meanwhile, buffering an entire frame, as many LCDs have done for decades, causes product-breaking input lag. I don't know if it's still as bad, but in the 2006 era stuff like the Dell 2001FP had terrible lag (as bad as turning on double-buffered vsync at 60Hz).

>the specs using HWiNFO and tell that it's just 1-3ms output latency plus

There are no specs that tell you the input lag; it's in the monitor's chips/software. Output is buffered inside the monitor itself, and even 1ms would be too much. Lag should only be added by the application, to allow flexible graphics programming; you can't have a shitty monitor eating all the headroom. The EDID now has a field for input lag, which most manufacturers either leave blank or fuck up, since they don't even know about the concept.



Set your monitor to 60Hz.

Download any game ever.

Enable double buffered vsync.

Enjoy your 16ms input lag.

Report back and tell us how you felt. Protip: I'll just laugh at you if you're a cretin who can't feel this.

The reason people know about vsync input lag is that in 2005 they bought a new game that had vsync on by default, couldn't move the mouse around properly, and googling "mouse lag" gave them the solution of "disable vsync". Also, the ear can detect around 8-13ms of input lag IIRC, but the last time I tested this I was 9 years old. It also depends on the base input lag.

An LCD already has 16.66ms of input lag at 60Hz. Turning on vsync adds another 16.66ms. USB mice are said to add 8ms on top.
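Summing those figures (the numbers are the ones quoted above, not measurements):

```python
# Additive input-lag budget at 60Hz, using the figures from the post.
frame_ms = 1000 / 60               # one refresh period: ~16.66 ms

lag_sources = {
    "LCD at 60Hz":           frame_ms,  # one frame of display latency
    "double-buffered vsync": frame_ms,  # another full frame
    "USB mouse polling":     8.0,       # commonly cited figure
}

total = sum(lag_sources.values())
print(f"total: ~{total:.1f} ms")   # ~41.3 ms
```

Which is why "you totally won't notice 50ms" is laughable: a stock 60Hz setup with vsync is already most of the way there before the game itself adds anything.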


>IPS glow

also shit pixel response curves, and possible input lag


network lag isn't really comparable to input lag


What's your point? I have two CRTs like what you describe, which indeed weigh 70lbs but are still superior in every other aspect. The weight/size argument is completely made-up shit. What dudebro says:

"I don't have space for it"

What dudebro doesn't want you to know: I already have a hard enough time getting a girl without being a tinfoil hatter!111

>even that is not that sharp anymore due to wear and tear it sustained over the almost two decades since it was manufactured

False. The real problem is that they'll die in a few years from the phosphors fading.

>colours off and tinted, trying to adjust it via the normal menu doesn't help much at all

You're literally describing an LCD.

>one of the colours became laggy and leaves a smeary trail on screen, making it useless for gaming or even movies with lots of fast moving scenes

ONE of the colors. That is the white-to-black transition, or any color of similarly high intensity. Meanwhile, the LCD's transitions are slow on EVERY transition, and when they're not, they have overshoot which just does the same thing. But that doesn't even matter, because the motion blur caused by "sample-and-hold" on LCDs is so high that you never even see the smearing. The entire image on any LCD is blurred when you see a moving object in any content that matches the refresh rate. To get an LCD with no motion blur (e.g. a proper implementation of ULMB/ELMB/etc), the LCD would need microsecond transitions, which still don't exist.
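Sample-and-hold blur is easy to quantify: your eye tracks the moving object while the pixel stays lit for the whole refresh period, so the smear width is roughly speed times hold time. A sketch (the 960 px/s speed is just an example figure):

```python
# Motion blur on a sample-and-hold display: the image is held static for a
# full refresh period while the eye keeps tracking, smearing it on the retina.

def smear_px(speed_px_per_s: float, refresh_hz: float, duty: float = 1.0) -> float:
    """Approximate perceived smear width in pixels.

    duty < 1.0 models strobed backlights (ULMB etc.), which shorten the
    hold time and proportionally reduce the blur.
    """
    hold_time_s = duty / refresh_hz
    return speed_px_per_s * hold_time_s

print(smear_px(960, 60))        # full-persistence 60Hz: ~16 px of smear
print(smear_px(960, 60, 0.25))  # 25% strobe duty: ~4 px
```

This is also why strobing only works cleanly when the panel's transitions finish inside the dark part of the cycle, which current LCDs can't manage.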


Movies aren't a big issue for motion blur. First, they're engineered to avoid fast-moving objects, because those look horrible due to judder, display motion blur (which doesn't look the same as real motion blur), and background strobing. Just because the car says 200mph doesn't mean any object is moving across the screen that fast; movies are recorded with long-exposure cameras to deliberately add blur. Second, movies are 24FPS.

>uses up to 180W of power

The display industry is such garbage that power is literally not an issue: the amount of time you'll spend looking for a non-shit monitor costs more than any power you might save with a particular model (assuming you can even find one that satisfies all the other needs and also has low power draw).
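To put that 180W in perspective, a rough yearly cost estimate (8 hours/day and $0.15/kWh are made-up-but-typical assumptions; plug in your own):

```python
# Yearly electricity cost of a 180W CRT vs. a ~30W LCD, back of the envelope.

def yearly_cost(watts, hours_per_day=8, price_per_kwh=0.15):
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

print(f"CRT: ${yearly_cost(180):.2f}/yr")   # ~$78.84
print(f"LCD: ${yearly_cost(30):.2f}/yr")    # ~$13.14
```

Call it ~$65/year of difference, i.e. roughly one "premium" monitor's price per decade.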

>because of the above it emits huge amounts of heat, doubles as a literal space heater in the winter, but isn't much use during hot summers

Exaggeration; neither is true.


>one of the VGA interfaces is semi-broken too, sending no EDID data and needing you to fuck around manually with xrandr and/or modelines in xorg.conf to even get the display mode (resolution/refresh rate) you want to work

EDID usually just states a few standard resolutions (listed in the VESA DMT spec). You use the standard 1280x1024 mode and it works right away. GTF has also worked every time for me so far. EDID can list a completely custom modeline, but I haven't seen that often, if at all.
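For reference, a modeline's refresh rate is just the pixel clock divided by the total raster size. Checking the standard VESA DMT 1280x1024@60 mode (timing numbers from the DMT spec):

```python
# Refresh rate of a modeline = pixel_clock / (htotal * vtotal).
# VESA DMT 1280x1024@60: 108.0 MHz pixel clock, 1688x1066 total raster
# (active area 1280x1024 plus horizontal/vertical blanking).
pixel_clock_hz = 108_000_000
htotal, vtotal = 1688, 1066

refresh_hz = pixel_clock_hz / (htotal * vtotal)
print(f"{refresh_hz:.2f} Hz")   # ~60.02 Hz
```

If a monitor sends no EDID, feeding numbers like these to `xrandr --newmode` is exactly the manual fuckery the quoted anon is complaining about.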



Okay, here's a new argument: widescreen sucks because viewing angles become shit (unless you have an IPS, but then lag becomes shit). Curved screens try to fix this by making you only able to view the display head-on, while marketing it as "more immersive".


You could always use GPU scaling, or even scale in software...

Scaling is only a problem on retarded monitors anyway, but yeah, good luck finding one with a good image and also non-retarded firmware.


I'm on a laptop from 2006. It can play 1080p content with software decoding.


You can literally walk into a thrift store or garage sale and get a 5:4 monitor for $5. I don't think 4:3 is much more, just harder to find.


lol, 2:3 pulldown is ALWAYS noticeable, not to mention 24FPS or 25FPS or whateverthefuck looks like a slideshow. "I remember when I was a kid I'd be like wtf, the movie is lagging, that's not even possible, why do they do this shit?" when watching a VHS. A single duplicate frame at 60Hz (16.66ms) is noticeable too, for an object moving across the screen. Not that any of this has anything to do with input lag.
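The 2:3 pulldown pattern is why it's visible: 24FPS film frames are held for alternating 2 and 3 fields of 60Hz video, so motion advances unevenly (judder). A sketch:

```python
# 2:3 pulldown: map 24fps film frames onto 60 fields/s video.
# Frames are alternately held for 2 then 3 fields -> uneven motion (judder).

def pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']  -- 4 frames -> 10 fields
```

Every 4 film frames become 10 fields, so 24 frames fill exactly 60 fields per second, but no two consecutive frames are on screen for the same length of time.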


>also 4k is absolutely worthless and just makes working harder in most software. I really hate trying to move a single pixel even on my hd

>software is shit therefore some type of display is shit

Non-argument. I have no idea whether 4k is good though; in the stores they show rigged content, and even more rigged for OLED.


Buying new hardware for a POS display is 100% company-to-company dicksucking bullshit. Lots of places here actually just buy used monitors for $5 for their company, while others will charge their company $100-$500 in materials.


seek help




bump so more charlatans can be BTFOd next week
