All posts by Sam M

Modern Monitors (Part 1)

hardware

When I say modern monitors, I'm referring to LCDs, or liquid crystal displays. Liquid crystal displays work by having polarizers that block light of a specific polarization. These polarizers are specially manufactured films that let light oriented a certain way pass through and block the rest. We then sandwich liquid crystal between two of these polarizers set at different orientations. The liquid crystal lets us use electricity to change the polarization of the light passing through, which in turn determines whether the second polarizer blocks it. The liquid crystals are electrically charged and discharged many times a second; this is the refresh rate, measured in hertz. The LCD screen uses a very bright backlight to produce light from behind these polarizers. The light has to be very strong because the polarizers in front of it block a large portion of it. In front of this sandwich of polarizer, liquid crystal, and polarizer are very small colored filters in three primary colors: red, green, and blue. Together these three small filters make one pixel. The screen you are most likely viewing right now is a huge array of these pixels.
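To get a feel for how much light a polarizer gates, here's a minimal sketch of Malus's law, the standard physics formula (nothing display-specific here): transmitted intensity falls off with the square of the cosine of the angle between the light's polarization and the filter's axis.

```python
import math

def transmitted_intensity(i0: float, angle_degrees: float) -> float:
    """Malus's law: the fraction of light a polarizer passes depends on
    the angle between the light's polarization and the filter's axis."""
    return i0 * math.cos(math.radians(angle_degrees)) ** 2

for angle in (0, 45, 90):
    print(f"{angle:>2} degrees -> {transmitted_intensity(1.0, angle):.2f}")
# 0 degrees passes everything and 90 degrees blocks it all; the liquid
# crystal layer twists the light's polarization between those extremes.
```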

During the 1970s LCDs were monochrome and most often used in calculators. In the 1980s the first LCD televisions were being developed as handheld televisions. By the late 90s LCD monitors were becoming the standard for computer displays. The benefits of LCDs were tremendous: they were much thinner and weighed only a fraction of what a CRT monitor did, and they drew far less power, roughly half that of a CRT display. The picture was also drawn differently than on a CRT. An LCD would refresh the entire screen almost instantaneously, as many as sixty times a second, whereas a CRT drew each line with an electron beam. As electronics manufacturing improved, the liquid crystal cells began to shrink the way transistors did in semiconductor manufacturing, and this led to ever-increasing resolutions. Basically, our ability to make pixels smaller and smaller meant we could fit more and more of them into the same area. Many modern televisions are now 4K, meaning they display a resolution of 3840 × 2160: 3840 pixels across and 2160 pixels in height. This also affects our handheld devices, with the iPhone X boasting a PPI of 458. That means it fits 458 pixels into an inch of space, hence pixels per inch.
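PPI is easy to compute yourself: divide the diagonal pixel count by the diagonal size in inches. A quick sketch using the iPhone X's published specs (2436 × 1125 pixels; the panel's true diagonal is about 5.85 inches, marketed as 5.8):

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """PPI = diagonal resolution in pixels / diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

# iPhone X: 2436 x 1125 pixels on a ~5.85-inch panel
print(round(pixels_per_inch(2436, 1125, 5.85)))  # ~459, matching Apple's quoted 458
```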

In the beginning monitors also used those horrible VGA connectors and DVI connectors. You can read more about those here. But in 2004 HDMI, or High-Definition Multimedia Interface, came to market. HDMI was a revolutionary new standard: the connection port was small, it was all digital, it could carry audio and data, and it was backward compatible with DVI. That meant you could buy an adapter to change your DVI output into HDMI and it would work, but without audio…

In the next part I'll go over variable refresh rate technology, curved screens, OLED, and the newest connector, DisplayPort.

90’s Monitors

hardware

When I first started dealing with computers 20 years ago, we used CRT monitors, also known as cathode ray tube monitors. They were huge and heavy; most of them weighed in at 20 pounds or more. They weren't ideal at all. Cathode ray tubes themselves were huge evacuated glass tubes with an electron gun at the back firing at a phosphor-coated metal screen at the front. Because of the physics involved in their operation, they often had to be shielded to protect the integrated circuits inside computers from interference. This in turn meant that personal computers, which at the time were most popular in the desktop format, had to have heavy metal cases designed to support these CRT displays sitting on top of them.


Back in that period we used 15-pin Video Graphics Array connectors, VGA for short. These connectors were some of the worst: they carried analog signals, which meant more shielding and wire separation to keep the signal clean. That made the cables thick and stiff, and not in a good way. The terminations on the ends of the cables were also terrible; they often caused the cable to stick out five inches from the back of the computer, and they had thumb screws on either side of the connector. The ports they plugged into were a huge hassle too: they had female receptacles for the screws, and for some reason those could unscrew from the port. Often after removing a cable these female screw inserts would stay on the cable, and you would have to take them off with a pair of needle-nose pliers. And because of the girth of VGA cables, if you didn't have those screws in place, whenever a user adjusted the monitor you would get called over to fix the resulting issue.

The next "innovation" was DVI, or Digital Visual Interface, cables. They were very similar to VGA cables but carried a digital signal; beyond that, not much changed. DVI cables were just as girthy as VGA, probably even thicker, and they kept the horrible screw terminals. In a way they made CRTs worse: with a digital cable available, the monitors got bigger. Some users in graphic design, like sign makers, wanted 27-inch monitors, which is considered small now, but back then a 27" CRT weighed in at around 50 pounds and took two people to lift into place. Usually users wanted to tuck these workstations into a corner, so those big cables would get pushed into sharp angles, which was really bad for them. This often caused issues like flickering, color artifacts, or scanline artifacts.

The thing is, these CRTs stuck around for a long time because people had often invested hundreds of dollars in them. Nowadays we rarely see them, though some retro gamers still use them because they give their games the scanlines they were used to. I remember some of the brands that made these CRTs, like ViewSonic, NEC, and Sony Trinitron. Amazingly, the one thing still around is those horrible VGA connectors. They are often used because they are cheap and

PC Parts Prices Pressure Past Present (Part 2)

hardware

I remember at the end of 2011 I was trying to build a network attached storage server, also known as a NAS. A NAS is a server designed to store files and serve them to computers on the network. I wanted to build it with a hardware redundant array of independent disks, also known as a RAID array. I was going to use FreeNAS as the operating system. FreeNAS is a FreeBSD-based operating system designed specifically for network attached storage devices. One of the coolest features of FreeNAS is how small it is; it can be run from a USB drive.
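As a rough illustration of the trade-off a RAID array makes between capacity and redundancy, here's a sketch of the textbook usable-capacity formulas for the common levels (generic RAID math, not anything FreeNAS-specific):

```python
def usable_capacity(level: str, drives: int, size_tb: float) -> float:
    """Textbook usable capacity for common RAID levels, assuming
    identical drives."""
    if level == "RAID0":            # striping, no redundancy
        return drives * size_tb
    if level == "RAID1":            # mirroring: one copy's worth of space
        return size_tb
    if level == "RAID5":            # one drive's worth of parity
        return (drives - 1) * size_tb
    if level == "RAID6":            # two drives' worth of parity
        return (drives - 2) * size_tb
    raise ValueError(f"unknown level: {level}")

# e.g. ten 2 TB drives in RAID6 -> 16 TB usable, two drives can fail
print(usable_capacity("RAID6", 10, 2.0))
```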

I chose to use a server case by Norco. They design server cases that take standard PC hardware; most other server cases use proprietary motherboards and require server hardware like Xeon processors and ECC random access memory. ECC RAM is RAM that performs error-checking operations so the memory is more stable. The Norco case would also fit into standard server racks. One downside is that they don't include any documentation. I will post a listing for the case.
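Real ECC memory uses a Hamming-style code that can also correct single-bit errors, but a lone parity bit is the simplest version of the detect-an-error idea. A toy sketch, not how ECC DIMMs actually encode data:

```python
def add_parity(bits: list[int]) -> list[int]:
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check(word: list[int]) -> bool:
    """A single flipped bit makes the parity odd, exposing the error."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
print(check(word))   # True: stored correctly
word[0] ^= 1         # simulate a memory error flipping one bit
print(check(word))   # False: corruption detected
```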


At the time, hard disk drives (HDDs) were very affordable; a two terabyte hard drive was around $70. Most manufacturers sell several "grades" of hard drives. Back then Western Digital had their Green, Blue, and Black drives. The Green drives were usually the cheapest: low-power drives designed for ordinary consumers. The Blue drives were the mainstream consumer line, while the Black drives were the high-end performance drives, with better warranties and higher dependability ratings. Western Digital has since added several other colors: Red for network attached storage and Purple for surveillance. At the time I purchased about ten Hitachi and Seagate "green" grade drives.

Unfortunately, at the end of 2011 there was a massive flood in Southeast Asia. Along with the horrible loss of life, many industrial areas were decimated. About twenty-five percent of the world's hard drives were manufactured in Thailand, and those factories were badly affected by the flood. This caused hard drive prices to nearly double for the next two years. Backblaze, a cloud backup company, was also severely affected by this event. They have an interesting blog post in which they explain how they resorted to buying external hard drives from Costco and "shucking" the enclosures to place the bare drives into their backup machines. You can read their blog here. I don't believe it took two years for the factories in Thailand to rebuild; I think the companies used that time to recoup some of their lost assets and slowly ramp their prices back down.


This series of unfortunate events forced me to pause the construction of my network attached storage server, but it also gave me time to prune all the files I had accumulated.

PC Parts Prices Pressure Past Present (Part 1)

hardware

I have been building PCs for about 20 years, and in that time I have seen market forces severely affect personal computer parts a few times. The thing about computer hardware is that the more advanced it gets, the harder and more expensive it becomes to fabricate the parts. For example, x86 central processing units in the 90s were manufactured by Intel, Advanced Micro Devices, VIA Technologies, International Business Machines, Nippon Electric Company (NEC), Cyrix, National Semiconductor, and about 8 others. But now the market has dwindled to just two: Intel and AMD. In fact, most advanced integrated circuit manufacturing is now consolidated into the few chip foundries that can produce chips at process nodes below 16 nm: Samsung, TSMC, GlobalFoundries, and Intel. You can read more about it here. These four companies currently control the supply of the most advanced and smallest chips. Those chips affect not only the central processing units of computers but RAM, graphics cards, and solid state drives as well, and they also control the supply of components for smartphones and mobile devices.


More importantly, since 2006 Intel has enjoyed market dominance over the only other competitor left in the CPU marketplace, AMD. The long and short of it is that Intel focused on larger cores with more features while AMD chose to focus on multiple smaller cores. During the period from 2006 to 2016 many software developers chose to focus on single-core performance, though now we are seeing the majority of computing move to multi-core. This allowed Intel to charge whatever they wanted for CPUs during that period, with AMD trailing behind and offering their CPUs for often hundreds of dollars less. But all that has been changing recently: AMD is now offering their Ryzen series of CPUs, which rival Intel's in performance, also for hundreds of dollars less. It's a strange time in the CPU market, with Intel stuck in a "tock" cycle while AMD charges ahead with huge multi-core chips like Threadripper. You can read more about the "tick-tock" chip advancement cycle here.
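To see why the single-core versus multi-core distinction matters to software, here's a minimal sketch of splitting a CPU-bound job across cores (a generic illustration; the workload and job counts are made up):

```python
from multiprocessing import Pool, cpu_count
import time

def burn(n: int) -> int:
    """A CPU-bound task: sum of squares up to n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [5_000_000] * 8

    start = time.perf_counter()
    serial = [burn(n) for n in jobs]      # one core does everything
    t_serial = time.perf_counter() - start

    start = time.perf_counter()
    with Pool(cpu_count()) as pool:       # spread the jobs across cores
        parallel = pool.map(burn, jobs)
    t_parallel = time.perf_counter() - start

    assert serial == parallel
    print(f"serial: {t_serial:.2f}s, parallel: {t_parallel:.2f}s")
```

Software written like the first loop only benefits from a faster single core, which is the market Intel optimized for; software written like the second benefits from AMD's many-smaller-cores approach.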


Outside of CPUs, there have also been huge price shifts in graphics components and RAM since 2017. Before 2017, graphics cards were mostly specialized hardware utilized by gamers, computer graphics artists, video editors, and scientific and engineering fields. But in mid-2017 prices nearly doubled, mostly due to the soaring popularity of cryptocurrency. Cryptocurrency is basically a type of currency generated ("mined") by computer hardware and the electricity it consumes. For a long time the value of cryptocurrency against the dollar was stable and barely kept up with the cost of the electricity and hardware it took to produce. But recently speculators have boosted the value way above those costs, resulting in a shortage of graphics cards and RAM and nearly doubling graphics card prices. The trend has started to fade, and prices are beginning to stabilize.
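The reason mining chews through graphics hardware is proof-of-work: miners race to find a number that makes a hash come out below a target, and the only way to find it is brute force, which parallelizes beautifully onto GPUs. A toy sketch of the idea (heavily simplified; real coins use different hash constructions and targets):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Search for a nonce whose SHA-256 hash starts with
    `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra zero of difficulty multiplies the expected work by 16.
print(mine("example block", 4))
```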


The future of PC parts prices is going to be hard to predict. With process sizes hitting physical limits, we'll have to see how much it will cost to overcome them.

The End of the Tick-Tock Upgrade Cycle for Central Processing Units in Modern Computers (aka Big Upgrade, Small Upgrade)

hardware

So, first off we have to go over some background on how modern computers work. Modern computers run on transistors. Transistors are essentially tiny switches that can be flipped on or off with electricity. If you want to read more about transistors, I'll put a link here. A microprocessor is basically a series of these transistors arranged so as to be able to compute mathematical problems. The arrangement is vital, of course, and is a very large field of study we call microprocessor architecture. Just as important are the links between and within these arrangements of transistors, called "bus" lanes, or just buses. When computers were first being developed, they usually took up an entire room of hundreds if not thousands of square feet, because early computers used vacuum tubes. But thanks to the hard work of computer, materials, and electrical scientists and engineers, we can now fit all that power onto our wrists. They did this by using semiconductors (materials that conduct electricity only under specific conditions) and light-based lithography to draw the microarchitecture onto those semiconductors.
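To make "switches arranged to compute math" concrete, here's a toy sketch of a half adder, the circuit that adds two bits, built from nothing but NAND gates (one of the simplest gates to build from a handful of transistors, and famously enough on its own to build any logic). The function names are mine, not standard hardware terms:

```python
def nand(a: int, b: int) -> int:
    """A NAND gate: the switch network a few transistors implement."""
    return 0 if (a and b) else 1

def half_adder(a: int, b: int):
    """Add two bits using only NAND gates."""
    n1 = nand(a, b)
    s = nand(nand(a, n1), nand(b, n1))  # XOR built from NANDs: the sum bit
    c = nand(n1, n1)                    # AND built from NANDs: the carry bit
    return s, c

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} = carry {half_adder(a, b)[1]}, sum {half_adder(a, b)[0]}")
```

Chain enough of these together and you get adders, multipliers, and eventually a CPU; the "arrangement" is everything.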


Since the mid-2000s, Intel has released major CPU products in a "tick, tock" fashion. A "tick" is a major upgrade to the processor like a die shrink, the die being the physical arrangement of buses and gates (transistors). A "tock" is usually an architecture refresh: adding features and refining the design, tightening the spacing between blocks of transistors, and creating more efficient paths for signals to flow. You can read more here. Intel's co-founder Gordon Moore famously observed that the number of transistors on a chip, and with it processing power, doubles roughly every two years (often quoted as every 18 months) thanks to this die shrinking. Moore's law has essentially held true until recently. The problem is that once feature sizes get down around 10 nanometers, electrons stop flowing nicely in their lanes and start leaking over into neighboring ones. This causes errors in computation, so we are essentially seeing a stop to Moore's law, which leads us to the end, or at least the slowdown, of the "tick" process of shrinking die sizes.
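The compounding is easy to put into numbers. A minimal sketch of the doubling rule (the 4004 transistor count is the commonly cited figure; the doubling period is the assumption you can vary):

```python
def moores_law(start_transistors: float, years: float,
               doubling_period_years: float = 2.0) -> float:
    """Project transistor counts under a fixed doubling period."""
    return start_transistors * 2 ** (years / doubling_period_years)

# Intel's 4004 (1971) had ~2,300 transistors; 45 years of doubling
# every two years predicts chips in the billions -- roughly what shipped.
print(f"{moores_law(2300, 45):,.0f}")
```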


Scientists and engineers are hard at work finding ways to shrink die sizes even further, using 3-dimensional architectures and exotic new lithography techniques to pattern chips more precisely. We can see that Intel stopped releasing yearly die shrinks at 14 nanometers (the Broadwell architecture), first released in September 2014. Their subsequent architectures Skylake, Kaby Lake, and Coffee Lake have all been made on that same 14 nanometer process. The next "tick," their Cannon Lake architecture, was supposed to be released in 2016 but has been delayed until 2019.


What does this mean for the modern consumer? Well, the pro of slower ticks is a slower upgrade cycle: consumers won't have to buy a new product every few years. The con is that their devices will stop getting more powerful, or smaller, as quickly.