
/tech/ - Technology



File: 56845a737a89611⋯.jpg (16.19 KB, 195x260, 3:4, Fowler's Universal Calcula….jpg)

File: 55051482d8f20dc⋯.jpg (44.66 KB, 500x394, 250:197, a3071ddf553147e2b5706cb58d….jpg)

File: 9f52f973ee2a664⋯.jpeg (7.05 KB, 272x185, 272:185, images (1).jpeg)


What would computing look like if we redesigned everything from the hardware up, with no attempt at backwards compatibility? Reinvent computing in 2018.


>no attempt at backwards compatibility.

A dead project.


Mill CPU


There wouldn't be an OS, and the hardware itself would handle things like scheduling, memory management, garbage collection, I/O, virtual memory, files, and access control. There are also HLLCAs (high-level-language computer architectures), a non-von Neumann design that actually executes the source code of programming languages. The CPU has components for parsing statements and expressions instead of decoding instructions. There are books on this from the early 80s that described what was supposed to be the future before x86 and RISC took over everything.

In an alternate universe where these designs became popular a long time ago, we wouldn't even be able to imagine what their 2010s computer would be like. The most recent thing I can find is TIARA from MIT, which is a descendant of the Lisp machine, but that looks like it was abandoned (or the NSA wanted to keep the technology out of the public's hands).






That's the dumbest shit I've read on 8chan in my entire life. The hardware does all that shit? Do you have even a modicum of knowledge of the complexity of these subsystems you want to implement in unmodifiable silicon?


I'd turn the 0 and 1's into 0 and 8's, just because it would make AI more plausible.

0 8 88 8888 etc


I don't know what it would look like, but for sure the hardware would give the user absolute control, such as the ability to halt execution, inspect & modify registers & memory, set breakpoints and step through... all done at the hardware level, and at the most privileged level (everything else would be less privileged, including any firmware).



You can already do that



If you enjoyed Intel CPU bugs that don't implement an OS, you will thoroughly enjoy your CPU bugs that implement a full OS.



Not if you're using current-generation CPUs. There'll absolutely always be black boxes with higher privileges than you.


File: 5764734394f1062⋯.mp4 (656.34 KB, 320x240, 4:3, squidward future.mp4)









Use trinary just to see what happens



Everything is reconfigurable configware/flowware running on FPGAs.



I think there was a Soviet computer that did that.




The Soviets played around with trinary for a while; it could have had some interesting benefits over binary, but the hardware issues were tougher, so they copied the West instead.



Is trinary not what quantum computing is?




Quantum computing has three states. Trinary computing has three states. How are they different?



Instead of just 1s and 0s you have -1, 0 and 1. Immediate benefits appear, such as not needing to sign numbers with tricks like we use now; it also seems that code density should improve somewhat. You could also implement more flags and features in a given trit-width architecture, for instance.
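The "no sign tricks" point refers to balanced ternary, where each trit is -1, 0 or +1 and negation is just flipping every trit. A quick Python sketch (my own illustration, not from the thread):

```python
def to_balanced_ternary(n):
    """Integer -> balanced-ternary trits (-1, 0, +1), most significant first."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:          # a digit of 2 becomes -1 plus a carry
            r = -1
            n += 1
        trits.append(r)
        n //= 3
    return trits[::-1]

def from_balanced_ternary(trits):
    value = 0
    for t in trits:
        value = value * 3 + t
    return value

def negate(trits):
    """No sign bit needed: negation just flips every trit."""
    return [-t for t in trits]

print(to_balanced_ternary(5))                                  # [1, -1, -1]
print(from_balanced_ternary(negate(to_balanced_ternary(5))))   # -5
```

Note that negative numbers need no separate representation at all; two's-complement-style tricks simply never come up.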


What if current day quantum computers are just tricky forms of trinary and it's all snake oil? I wouldn't put it past (((them))) after knowing what we know now.



Doing some Wikipediaing, the number of states goes up the more qubits you add. So a 1 qubit quantum computer has three states, but 2 qubit would have 4 and 3 qubit 8 states.


I kind of had a branching-fractal idea that would replace von Neumann architecture back in my edgy "im cool mom, fug off" phase, but I kinda forgot about it and lost the notes.

Would like to see some new type of architecture though. Von Neumann is too rigid in my opinion.



> So a 1 qubit quantum computer has three states, but 2 qubit would have 4 and 3 qubit 8 states.




I'm not claiming to know anything about quantum computers, but I don't think a superposition is a third state. It just means the state of the bit is uncertain, with some probability associated with it. When you have 2 bits, they can have 2^2=4 states (00, 01, 10, 11). When you have 2 qubits there are still these 4 states, but the difference is that you can only manipulate the probability that the qubits are in one of those states. I think.
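The counting argument above can be sketched in a few lines of Python (my own toy illustration, not a quantum simulator): an n-qubit register has 2**n basis states, each carrying an amplitude, rather than a third discrete value per qubit.

```python
import itertools

def basis_states(n):
    """Enumerate the classical basis states of n bits/qubits; there are 2**n of them."""
    return [''.join(bits) for bits in itertools.product('01', repeat=n)]

# A quantum state assigns one complex amplitude to each basis state;
# superposition is uncertainty over these states, not an extra state per digit.
for n in (1, 2, 3):
    states = basis_states(n)
    print(n, 'qubits:', len(states), 'basis states:', states)
```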



Yeah, and it didn't do anything spectacular compared to binary. It just stores data in a different fashion, and all that does is make it use fewer digits per decimal digit or text character. That is all.



Ternary, not "trinary", you faggot = 3 possible states: -1 = OFF, 0 = MidRange (e.g. 5v), 1 = HighRange (e.g. 10v)

Quantum = 2 Possible states: 0 = OFF, 1 = ON

Quantum computers, however, have every bit both 0 and 1 at the same time, undetermined until such time as it is observed, ala Heisenberg's uncertainty principle



wow I butchered that.

Every qubit in a quantum computer is both 1 and 0 at the same time. This allows you to perform certain functions, like prime factorization, extremely fast, as you can compare outputs a factor of 2 quicker per qubit



the bigger question is whether or not the quantum computer will come with ME or PSP



Why not both?





Replace von Neumann with Harvard architecture, or make purely functional CPUs like Urbit but in hardware and less kooky


how about no?



I binged on the lecture videos about the Mill CPU recently; they gave me an immense tech-boner, but it's still very much based on being backwards compatible.


>>850315 Curious how long it would take for an original x86 type design.


Analog CPUs ftw.



That's because the world of business sits upon old but good working software that long ago lost its parent company, let alone its source code, and cannot be ported to any other platform, so they stick with hopelessly obsolete OSes and hardware just to keep using it. And it's not your cookie-cutter office-suite type of software either; it's all purpose-built, so you can't just replace it with something else.



Intel does implement a full OS, and apart from a few bugs it’s been running smoothly in tens of millions of devices for literally decades.



Zilog processors have been running in tens of billions of devices for a few more decades than Intel's have, and never had any problems. You can just admit that Intel does a substandard job at CPU design.



This was new to me. I'm familiar with VLIW as used in the Itanic architecture. How does Mill differ? Is it VLIW done right?

I note one advantage of VLIW is that it eliminates speculative execution and hence is immune to SPECTRE vulnerabilities.





And no. Ternary means that instead of 0 and 1 there is 0, 1 and 2, thus enlarging the informational capacity and the speed of processing said information.

Basically, with the same amount of transistors and a little bit of electrical-engineering trickery you could speed up your processor. Thing is, this shit needs to be built from the bottom up, which is why nobody fucked with ternary but the Soviets in the 70s (and they only made ternary computers transistor by transistor, not this silicon die shit)



It's basically VLIW with variable-length instructions. ~33 instructions/cycle/core for a 32-bit design as of 2013.



Fractal computation? What does that consist of?



It's not faster. A binary computer evaluates data bit by bit; a ternary computer evaluates data trit by trit. You can scale the register size however you want to process more data units at a time, but the principle stays the same. Using wider words is only a speedup in the sense that a computer only capable of short words needs to carry out multiple operations to evaluate a long word. Using wider words doesn't allow you to compute multiple short words at a time; you need vector processing for that, and again it scales the same way. It has nothing to do with the type of data storage. Nobody makes ternary computers because nobody can get transistors to work well in linear mode under such extremely tight constraints and voltage tolerances - it's hard enough to make them work in binary.



The Soviets claimed that they were getting better state transition speeds with the ternary transistors. Americans couldn't replicate it, then spent billions of dollars making fast binary transistors. One of the reasons that it wouldn't be faster is because ternary transistors are still in the 70s/80s.


Just passing by to say that quantum computing will never happen, because the theory behind it is wrong, and therefore so are its predictions.

That must be why all the hype so far has been from simulations and nobody actually managed to make it.



The Soviets were claiming a lot of horseshit. Anyway, to get a transistor to work with ternary data it needs linear mode, but at current scales quantum effects dominate the physical process of transistor operation, making it difficult to distinguish between two opposite states, let alone tell some third state in the middle.



Also, in linear mode transistors' efficiency plummets and they start dissipating a shit ton of heat while drawing a shit ton of power. Hence transistor-based power supplies all work in PWM mode, not linear mode. For all the drawbacks of having power only go either zero or full blast, it beats using 10x as many transistors and batteries/electricity and putting huge radiators on them to boot.
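The linear-vs-PWM point comes down to simple arithmetic: a linear element dropping half the supply voltage burns as much power as it delivers, while an ideal switch dissipates almost nothing in either the fully-on or fully-off state. A rough sketch with made-up numbers:

```python
# Illustrative numbers only: a linear regulator dropping (Vin - Vout) at current I
# dissipates that product continuously as heat.
Vin, Vout, I = 10.0, 5.0, 2.0

linear_loss = (Vin - Vout) * I      # 10 W burned as heat in the pass transistor
load_power = Vout * I               # 10 W actually delivered to the load
efficiency = load_power / (load_power + linear_loss)

print(f"linear regulator efficiency: {efficiency:.0%}")   # 50%
```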



Enjoy the safety of a dual security system.* *At the cost of 1970s performance. Innovation that excites and is more diverse.



Nigga what, they are real. Just look up D-Wave!



He's suggesting D-Wave is just using a simulator.


I want to redesign the entire world and be paid in food, clothes and a room to live in.

Everything else is BULLSHIT.



You dumb faggot

Trit has more information stored in it than a bit

It takes 2 trits to store information of around 3 bits

Dumb cuck



Also, linear mode is not needed

Only negative and positive voltages

t. EE student with uni project based on ternary computers



>muh politics dictates mathematical truths

This is why, even though you are somewhat intelligent, you will never look further than your "I know best" attitude. Your potential bright future will be denied by your bloated ego.

If you looked up the maths behind ternary computers, then you would see what I am talking about

go search for the Texas master's paper on ternary computers



>what is D-wave

>what is IBM quantum processor

Shiggy the diggy



Marketing brands



D-Wave is not a quantum computer.

IBM Q is a research product; it's not something that you can just buy and run turnkey.



And just how do you drive a transistor with negative and positive voltage, to get it to output negative or positive voltage?



And decimal has more information than both; it takes just over 2 trits to store one decimal digit. Your point is fucking retarded and is not relevant.



Draw a diagram of the operation, blanket explanation will not suffice.


File: b93ed5cab46c740⋯.png (13.41 KB, 600x488, 75:61, glXad.png)


Dumb nigger


Moving the goalpost

Also, yea Dwave is quantum



I won't teach you basics of electrical engineering, but here is a pic of ternary not gate for you, dumbo


You do not know how decimals in binary computers work

Or what the informational capacity of a trit is




Also, here is truth table for ternary not gate because you are an idiot

Input: -0.5v 0v 0.5v

Output:0.5v 0v -0.5v
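The truth table above is just trit negation; modeled at the voltage level it is a one-liner (a toy sketch of the table, not a circuit simulation):

```python
def ternary_not(v):
    """Ternary NOT on the voltage levels from the table: -0.5V <-> +0.5V, 0V -> 0V."""
    assert v in (-0.5, 0.0, 0.5), "not a valid ternary level"
    return -v if v != 0 else 0.0

for v in (-0.5, 0.0, 0.5):
    print(f"{v:+.1f}V -> {ternary_not(v):+.1f}V")
```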




Good. Now show me how you pack this gate into millions of transistors.



Here is further explanation for your dumb ass

M2 is a p-channel MOSFET

M1 is an n-channel MOSFET

M2 connects the output to 0.5v when its gate is lower than its source (i.e. -0.5 < 0.5)

M1 connects the output to -0.5v when its gate is higher than its source (i.e. 0.5 > -0.5)

If 0 is on the input, then you have a direct line between 0.5 and -0.5

If the input is left floating (high impedance), the resistors make a voltage divider with 0 on the output



NOT gate is trivial, post XOR and NAND.



>into millions of transistors


>logic gate into transistors

You have no idea of what you are talking about, do you?

Do you know what planar technology even is?

Do I need to explain to you how silicon dies work? Wtf? Are you this pompous and dumb at the same time?



There is a whole book on maths and electrical circuits of logic gates of ternary computers

You do not want to know about it, you want to win this discussion

You are too lazy to learn for yourself, I am too lazy to copy paste shit for you



>hurr durr educate yourself

Opinion discarded.



Yes Yes Yes No



Nice non-arguments, with non-knowledge.


File: 3732fb1294bff5b⋯.png (10.17 KB, 344x600, 43:75, and.png)


Actually never mind, 3 seconds on Google revealed that you yourself pulled that shit from Google. Here's an AND gate. Evidently it takes twice as many transistors to build a ternary gate as a binary gate. It is my assumption that it takes twice as many elements to make a ternary latch too. So it halves element density: a die of any given size will have twice as much bit-wise shit in it as trit-wise. Meanwhile, using ternary does not halve the element count; it only cuts it by a factor of about ⅓. Therefore you're at a net loss of computational power per unit of die area just for using ternary instead of binary.



Yea, I pulled it off Jewgle because you are too dumb to do it yourself, and why would I waste any more time on you than that?

Also, why make a picture when you can just download it?

Next thing, your assumption

You are fucking wrong

Other than that, pretty good, except for not realizing that trits have bigger info capacity, therefore bigger speed for an equal amount of transistors



They use the same transistors, therefore they switch at the same rate, therefore the gates operate at the same speeds. You need log3(X)/log2(X) = 0.63 as many gates to compute a number of length X, but they're double the physical size, so you're at a 26% net loss of performance when space-constrained, or a 26% increase of die size otherwise. Come up with a transistor that can actually handle three states, then we'll talk.
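The 0.63 and 26% figures in this post are just logarithm arithmetic, easy to check (a quick sketch; the 2x gate size is this poster's assumption from upthread, not an established fact):

```python
import math

# Digits needed to cover a range X grow like log_b(X), so the trit/bit count
# ratio is log(2)/log(3) ~= 0.631, independent of X.
ratio = math.log(2) / math.log(3)

# Under the thread's assumption that a ternary gate is ~2x the size of a binary one:
area_factor = 2 * ratio   # ~1.26, i.e. ~26% more area for the same numeric range

print(f"gates needed (ternary/binary): {ratio:.3f}")
print(f"relative area at 2x gate size: {area_factor:.3f}")
```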



>but they are double the physical size

Wrong assumption again

Go away and come back when you learn more about the topic

You'd never heard of how ternary logic works with negative voltages a few hours ago, so stop acting like a know-it-all



Two transistors are double the size of one.



What two transistors are you talking about?

Just based on one AND gate you got to that conclusion?

You do realize they don't use discrete logic gates on processors, and do optimization to utilize more space?

You know NOTHING of what you are talking about. Nothing.



You have two current directions. Transistors don't work in two directions, so you need a transistor for each direction at each junction. Therefore two transistors are needed where binary system only needs one. It doesn't even have to do with gates, just basic physics.



I'd probably go with a -1 to +1 ternary for the sake of memory (0,1,2 strikes me as more error prone) and place a greater emphasis on distributed computation. We might be capable of optical computing at this point, as well.



dumbest /tech/ post today. congrats. Garbage collection in the hardware? Really? I think /r/programming would be more comfortable for you.

Thinking about it a bit, I really only want one thing: fully documented hardware and standards. Far more strict interface and communications standards between any components that are meant to be interchangeable. An open and documented boot process that is completely under the end-user's control. Black-box hardware can be interfaced into the system but it is quarantined to a restricted bus and separate memory pool with bottom-bitch status DMA onto the main memory bus. Any black box hardware that can't adhere to the open standards gets the functional, but slow lane to account for their probable inclusion of street shitter firmware coding complete with NSA backdoors. Back on the boot process - a real BIOS with reasonable, if unoptimized, system functions in firmware - somewhat like the Amiga. Also none of this new bullshit where critical parts of the peripheral hardware's firmware is absent until this new convoluted bullshit boot sequence loads it up from *somewhere* (most definitely an ease-of-use feature for NSA).

I guess what I want is control of my system back.


Something that addresses all points made in https://blog.invisiblethings.org/papers/2015/state_harmful.pdf.



>>muh politics dictates mathematical truths

Given that what I said was that they were getting better speed out of the machine due to its fundamentally different hardware, I think that you are the only one whose politics dictate their mathematical truth. While the things the Soviets said should be taken with a grain of salt, dismissing them out of hand only reveals your own political biases.

> If you looked up the maths behind ternary computers, then you would see what I am talking about

> go search for the Texas master's paper on ternary computers

After a quick Google search, I'm going to have to assume that you made "Texas Masters" up. Props to you on sending me on a wild goose chase for a person with an unfortunate name that fills search results with tons of unrelated shit.

The only things that I can find on ternary computers are a bunch of computer scientists (applied mathematicians, not electrical engineers) saying that it would be awesome to have ternary computers because they have better information density.


File: e15066438600907⋯.jpg (823.12 KB, 1080x1920, 9:16, 2018-01-14 13.11.45.jpg)


>>I made my shit up

>even though I glued up the paper to read for my project

Go kys faggot

Your weak Jewgle skills are your own problem, not mine



take notes from >>850441 on how to not be a huge retarded faggot



I mean that's cool and all, but you're basically trading compromised firmware on the chip for compromised firmware binaries in your "trusted device". I don't see the security benefit at all.



The point is to have the root of trust be an owner controlled key versus a hardware vendor's or Microsoft, whether the owner supplies it by toggling it in from a piece of paper on every boot or some other difficult to compromise system. If we're designing a computer from scratch, we don't need to rely on USB firmware.
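The owner-controlled root of trust described above can be illustrated with a toy hash-pinning model. Everything here is made up for illustration (a real design would use public-key signatures rather than a pinned hash), but the chain-of-trust idea is the same: the owner, not the vendor, decides what the machine will run.

```python
import hashlib

# Hypothetical: the owner pins the hash of the exact firmware they trust.
# On a from-scratch machine this value would come from the owner, not the vendor.
OWNER_PINNED_HASH = hashlib.sha256(b"trusted firmware v1").hexdigest()

def boot(firmware_image: bytes) -> str:
    """Measure the firmware and refuse to run anything the owner didn't pin."""
    measured = hashlib.sha256(firmware_image).hexdigest()
    if measured != OWNER_PINNED_HASH:
        return "refuse to boot: firmware not trusted by owner"
    return "boot: firmware matches owner's root of trust"

print(boot(b"trusted firmware v1"))
print(boot(b"vendor blob with surprise features"))
```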



What does the key have to do with the firmware still being compromised? You still have to download the firmware for your wifi chip from somewhere, and nothing forces them to make it open source.


2018? Nothing.

What we are stuck on is von Neumann architecture; the problems of decades ago are still the problems today in many ways.

We should redesign when memristors are real.



Since you would control the root of trust, you would also control what firmware and software etc. the system trusts to run, all the way up through bootstrap. Keep in mind this is an ideal, from scratch computer. If you don't trust the firmware, don't infect your system with it. Probably isn't a way to accomplish this on today's pozzed x86 designs.



>What would computing look like if we redesigned everything from the hardware up? with no attempt at backwards compatibility?

Nobody would use your products and the wintel machine would carry on as usual.



HP cucked out of memristors, the whole thing is total vaporware.



That's not an answer to his question, you cock-sucking faggot



> Doesn't give a university name

> Doesn't give a human name

> Expects someone to find anything with the vague-ass answer he gave


> How can a ternary ALU compete with the new generation of binary ALUs? The key is in the size of computing numbers. A four bit number can compute a number the size of 16. The four trit ternary ALU can compute a number the size of 81. This number is approximately five times as large. Therefore, the ternary ALU can be as slow in circuit speed as five times slower than the binary ALU and compute the same size of number faster than the binary ALU. This means that if the state-of-the-art for addition is 7 ns, the ternary ALU must only add faster than 35 ns in order to compete with the binary ALU. The design of the ternary ALU constructed utilizing multi-valued logic and, compares, if not surpasses, the computation time of a binary ALU. There are additional advantages in implementing ternary logic instead of binary logic.

-- Page 95 (101 of the PDF)

Your arguments so far for ternary being crap are:

1) Soviets (ad hominem fallacy)

2) Maths (which you provided a source that directly contradicts you)

As for the Soviets, did they have success because:

> Someone's family would go to a gulag if they didn't claim success?

> Their state-of-the-art was so shit that it wasn't difficult to improve upon, and the machine came at a fortuitous time in history?

> They really had success?

From ( http://mason.gmu.edu/~drine/History-of-Ternary-Computers.htm ) it appears that the answer is the second. I had remembered reading somewhere that the Soviets found the ternary to be faster: apparently that was only true because any new machine would have been faster than its contemporaries due to how shit their machines were.
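For what it's worth, the size arithmetic in the quoted ALU passage checks out (a quick sketch):

```python
# An n-digit register in base b distinguishes b**n values, so 4 trits cover
# roughly five times the range of 4 bits, as the quoted paper says.
n = 4
binary_range = 2 ** n    # 16
ternary_range = 3 ** n   # 81
print(binary_range, ternary_range, ternary_range / binary_range)  # 16 81 5.0625
```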



What's spectacular about binary then?



But it's the factually correct answer.



tbh we should have ternary storage devices if nothing else. Converting binary to ternary would reduce the amount of digits the device has to store by a full third, which is significant enough to be appealing to at least some consumers.



is that what multilayered ssds are? the bits on hdds overlap too iirc



No. A multi-level cell is a cell that can hold more than one bit. A ternary cell would store trits, not bits. Multi-level SSDs increase storage by increasing the amount of bits each cell can store. Ternary would increase storage by decreasing the amount of digits that need to be stored in the first place. E.g., 11011011 base 2 is 22010 base 3: 8 bits become 5 trits.
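The example conversion is easy to check (a small Python sketch):

```python
def to_base(n, b):
    """Digits of non-negative n in base b, most significant digit first."""
    digits = []
    while n:
        digits.append(n % b)
        n //= b
    return digits[::-1] or [0]

n = 0b11011011                      # 219 decimal
print(to_base(n, 2))                # [1, 1, 0, 1, 1, 0, 1, 1] -- 8 bits
print(to_base(n, 3))                # [2, 2, 0, 1, 0]          -- 5 trits
```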


File: 74dae1f889da4a2⋯.jpg (16.25 KB, 480x360, 4:3, scuttle.jpg)

Question. It's generally agreed that quantum computers won't be useful for normal day-to-day computations, but will probably be necessary for advanced number crunching, crypto, etc. Sort of like how we need a graphics card to render 3D. Does this mean one day our computers will contain QPUs?



>Your arguments so far for ternary being crap

You are a fucking autist

ITT I fucking defended ternary computing as being faster and more efficient than binary



Except the complexity of ternary means each cell is so much larger physically that you can't fit as many on the silicon.



>Does this mean one day our computers will contain QPUs

You do realize that for it to work you have to cool it down to 0 Kelvin, right?



>ITT I fucking defended ternary computing as being faster and more efficient than binary


I said:

< The Soviets claimed that they were getting better state transition speeds with the ternary transistors. Americans couldn't replicate it, then spent billions of dollars making fast binary transistors. One of the reasons that it wouldn't be faster is because ternary transistors are still in the 70s/80s.

You replied:

>muh politics dictates mathematical truths

> This is why, even though you are somewhat intelligent, you will never look further than "I know the best" attitude. Your potential bright future will be denied by your bloated ego.

As far as I can tell, you were accusing me of being a /leftypol/ shill for commies, and saying that any Soviet successes were lies because the Soviet Union was the Evil Empire of subhumans that your Führer failed to destroy. I assumed that I was talking to a /pol/lack who had been exiled from his glorious dictatorship for saying something the BO didn't like. If, however, what you intended was that ternary uses the same technology as binary and that there would be no speed lag, well, you could have just fucking said so. Your demeanor came across as if you agreed with this guy: >>851466



0 kelvin is impossible you dumbo



And I thought you said that the Soviets "claimed" (aka faked their results for politics)

Sorry for the miscommunication



Quantum computers are more reliable at colder temperatures; you don't need to reach 0 K (which is impossible).

There could be any number of breakthroughs that make quantum computing more reliable at higher temps.


>People are still talking about quantum nonsense here




you can't have hardware without software


File: 30869d76a79ebf8⋯.pdf (14.86 MB, Computer-Architecture-A-Mi….pdf)

<Ultimate meme parallelism chip

>OICP (RSSB has some interesting characteristics)


>Belt machine

>Network on a chip




>An equilibrium scenario is not a possible future state of our universe.

Why? Won't at some point entropy be at maximum, or at least arbitrarily close to it?


File: 49cf9d75bcfef4a⋯.jpg (112.24 KB, 1024x683, 1024:683, setun.jpg)




It was called the Setun and had two models. In the period of Khrushchev's Thaw, the USSR saw considerable liberalization of arts and sciences, and this included experimentation in cybernetics. For a while, the government, which under Stalin denounced cybernetics as bourgeois nonsense and thus made it effectively forbidden, now showed an interest in its potential use for increasing productivity, as the planned economy was a mammoth task whose solution obviously was far from optimal. Meanwhile, among other oddities such as a mechanical computer driven by hydraulics, the ternary computer was conceived, and thanks to the technology of the time, a ternary computer was much, much cheaper and easier to build, as it used far fewer components and consequently had fewer hardware malfunctions. It was so reliable and cost-effective that the factory assigned to build it tried to sabotage production, preferring to sell big old binary machines because the bigger cost and price were better for their bottom line. So, sadly, only about 50 Setuns were sold.

There were also, I think, at least 4 pitches to create what would essentially be a first internet, connecting factories, universities, government buildings etc. in order to optimize production and organization, but as you might guess, none were approved, partially for being overly ambitious, partially due to... political carelessness. And that was even before Khrushchev was ousted. Before that happened, the government's brief flirt with cybernetics ended rather abruptly, in some measure because it worked too well. There were early government-sanctioned experiments with optimal planning, and the results were very encouraging. The most likely conjecture goes, the bureaucrats that ruled the country realized that such efficient automation of management meant they would lose power. So cybernetics lost the interest of the State, but at least it wasn't verboten as it was under Stalin. Later on, Setun's creator and his team made a new iteration called the Setun-70 (though the project started in '67), but by then the wave of liberalization was gone, this new project had its budget and location fucked around with by its university, and the very original Setun was destroyed with a fucking axe and thrown in the dump. The Setun-70 project survived and reached completion, but it was never mass-manufactured. It didn't help that the transistor was becoming the norm, putting binary computers almost on par with the Setun. It still had its advantages, but not enough to escape the enmity of the establishment.

And even in that short timespan when the authorities wanted to hear what cyberneticians had to show, they still had to wrangle bureaucrats and managers every step of the way, not to mention deal with politicking among their own peers. This reached its apotheosis in 1969, when the government decided to flat-out abandon all native computer development and instead rely on Western technology, at the time the IBM 360. This not only guaranteed that they would necessarily always lag behind America's computer science, but wasted a million possibilities that had cropped up in the brief blooming of Soviet cybernetics, including the ternary computer and automated management. Just imagine it: you're an academic who spent your entire life trying to research computers on the downlow, and when finally you were given a chance, you showed that you and your peers might be able to fix the mess that was the planned economy... only to be chased back into the darkness, as the same bureaucrats that gave you a chance now threw your entire life's work down the drain. It's maddening.

Lastly, an odd note. To his dying day, Setun's creator maintained that a ternary system was superior not just because of computer architecture, but from an epistemological angle as well; it was how the human mind operated (yes, no, maybe), whereas binary's everything-or-nothing principle "excluded the middle". He would refer to Aristotelian syllogistics as the philosophical base, and to Lewis Carroll's (yes, he of Alice in Wonderland fame) symbolic logic. I confess this is way out of my depth, however. Here's his explanation: "Aristotle's logic is ternary and that ternary is a necessary, but not a sufficient, condition for adequate logic". Yeah, I have no idea either. Regardless, the Setun models remain the only electronic ternary computers ever built, notwithstanding some garage-made novelties.



Base 3 computing. It should be focused on efficiency and security.


File: c3d4b6ec0e903e7⋯.png (130.88 KB, 1256x735, 1256:735, fukt.png)


RISC without the risk


Analog computers. They could theoretically run way faster and in real time.


File: df99d65634fe4ca⋯.gif (1.52 MB, 480x328, 60:41, bq-tell-me-more.gif)



Theoretically, everything analog is better, but also extremely hard to tame and work with at higher levels.
