
Is this technologically feasible, given that an isolinear chip, quoted at 2.15 kiloquad in the TNG TM is about the size of a microscope slide?

The problem with the previous theory is that it assumes Star Trek computers are ordinary binary computers. Given the advances in present-day quantum computing, it is quite likely that Star Trek computers are quantum computers, not binary ones.

Moore’s Law is the observation that computing capacity doubles roughly every two years.

My theory is that a kiloquad is an exponential measure of memory: 2^100 bytes is 1 kiloquad, 2^200 bytes is 2 kiloquads, and so on. Start with a 256 GB USB memory key from 2011 (roughly 2^38 bytes) and extrapolate Moore’s Law out to 2365, when TNG is set: that is 354 years, or 177 doublings, giving 2^38 × 2^177 = 2^215 bytes. An isolinear chip (which is roughly the same size as a USB key) would therefore hold 2^215 bytes, which fits this exponential model for the kiloquad exactly, because we know one isolinear chip is 2.15 kiloquads.
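To make the arithmetic explicit, here is a quick Python sketch. The kiloquad-as-exponent model (k kiloquads = 2^(100k) bytes) is this answer's assumption, not anything from canon:

```python
from math import log2

# Assumed model (not canon): k kiloquads = 2**(100 * k) bytes.
BASE_YEAR, BASE_BYTES = 2011, 256 * 2**30   # 256 GB USB key in 2011
TNG_YEAR = 2365
DOUBLING_PERIOD = 2                          # Moore's Law: one doubling per 2 years

doublings = (TNG_YEAR - BASE_YEAR) // DOUBLING_PERIOD   # 354 / 2 = 177
exponent = int(log2(BASE_BYTES)) + doublings            # 38 + 177 = 215

print(exponent)        # 215 -> the chip holds 2**215 bytes
print(exponent / 100)  # 2.15 -> exactly the quoted 2.15 kiloquads
```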

As for whether it is technically possible, look up "qubit" on Wikipedia. This is the "bit" used in quantum computers. In 2009, the first solid-state quantum processor was built at Yale University. It had only 2 qubits. But by TNG's time, quantum computers (or their successors) would be the norm.

The thing about qubits is that while each has only two basis states (0 and 1) like a normal bit, a register of qubits can exist in a superposition of ALL of its basis states at once, whereas a normal register stores exactly one. So every qubit you add doubles the size of the state space. A qubyte of 8 qubits spans 256 basis states, the equivalent of 256 configurations of a normal binary register; 9 qubits doubles that to 512, and so on. Also, one proposed method of storing qubits is optical, using polarized light.
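The doubling claim above is just the exponential growth of the state space, which a two-line sketch can show:

```python
# State-space-doubling claim: n qubits span 2**n basis states.
def basis_states(n_qubits: int) -> int:
    """Number of classical bit patterns a register of n qubits can superpose."""
    return 2 ** n_qubits

print(basis_states(8))   # 256 -- a "qubyte"
print(basis_states(9))   # 512 -- one more qubit doubles it
```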

So while the previous technical description may not be feasible for a normal binary computer, it could be quite feasible for a quantum computer.

As for 1 quad = 1 quadrillion bytes, that’s completely ridiculous. One quadrillion bytes is only 1 petabyte (1024 terabytes), and we have storage arrays that size now. USB keys in January 2011 hold about 256 GB. Moore’s Law predicts we should have 1 TB USB keys by 2015, and 1 PB USB keys 20 years later, in 2035. So surely by 2365 we would have moved far beyond that.
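Those two predicted dates follow directly from the doubling rule; a small Python sketch of the extrapolation (using this answer's 256 GB / 2011 starting point):

```python
# Extrapolating the answer's Moore's Law figures for USB key capacity.
def predicted_bytes(year: int, base_year: int = 2011,
                    base_bytes: int = 256 * 2**30) -> float:
    """Capacity assuming a doubling every two years from 256 GB in 2011."""
    return base_bytes * 2 ** ((year - base_year) / 2)

TB, PB = 2**40, 2**50
print(predicted_bytes(2015) / TB)   # 1.0 -> 1 TB by 2015
print(predicted_bytes(2035) / PB)   # 1.0 -> 1 PB by 2035
```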


From H. Peter Anvin: …The 2.15 kqd isolinear chips [would have] a bit density of 2.94e+15 bits/mm^3 (I have assumed the dimensions to be 90x30x2.5 mm; this is probably on the high side if you exclude the part where you handle the chip); that means each bit could form a cube 7.0 nm (70 Å) to the side. The chips are optical, which I assume means they are read and written with electromagnetic radiation that behaves somewhat approximately like light. 7 nm is in the far-ultraviolet to near-X-ray region (visible light ends at about 400 nm), which is really pushing the limit. Assuming some form of multi-state encoding may push this down to the near UV, which would be a bit more practical to deal with, and more “optical”, but that is irrelevant. Hence, what we “know” about ST computer technology seems to correlate pretty well with the definition 1 quad = 1 quadrillion [American] bytes. It may be bits or bytes (it is only a factor of 8, obviously… it changes 7 nm to
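The quoted 7.0 nm cube follows from the quoted density alone (each bit occupies a cube whose edge is the cube root of the volume per bit), so the two figures can be cross-checked with a few lines of Python:

```python
# Cross-checking the quoted figures: a density of 2.94e15 bits/mm^3 implies
# each bit occupies a cube of side (1 / density)**(1/3).
density = 2.94e15                      # bits per mm^3, as quoted
side_mm = (1 / density) ** (1 / 3)     # edge of the cube one bit occupies
side_nm = side_mm * 1e6                # 1 mm = 1e6 nm
print(round(side_nm, 1))               # 7.0 -- matches the quoted 7.0 nm
```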


Experts123