<p><em>Shantanu's blog · <a href="https://shantanujoshi.github.io">https://shantanujoshi.github.io</a></em></p>
<h1>Running the LG Ultrafine 5k on Windows 10 &amp; Linux</h1>
<p><em>2017-06-01 · <a href="https://shantanujoshi.github.io/lg-5k-windows">https://shantanujoshi.github.io/lg-5k-windows</a></em></p>
<h2 id="heading2"> TL;DR</h2>
<p>You can drive the LG Ultrafine 5k under Windows 10 using a Thunderbolt 3 AIC (the AsRock card is the only one I've tested so far) and a motherboard with a Thunderbolt 3 header on it.</p>
<hr />
<p>When the LG Ultrafine 5k was launched it was explicitly “built for Mac”, with no indication of compatibility with Windows devices. In fact, after its release, Linus of Linus Tech Tips explained in a review that Apple was using a special, seemingly magical implementation of Thunderbolt 3 (TB3) in which two DisplayPort 1.2 signals are transmitted over the single TB3 cable, making it impossible for non-macOS devices lacking this implementation to use the 5k display at all. After testing, most people with TB3 laptops were only able to drive the display at 4K (4096x2160).</p>
<h2 id="heading2"> WHO CARES BRUH JUST USE YOUR MACBOOK</h2>
<p>Is what most people would say, but it’s important to note that this 5k screen is far ahead of any other 5k screen in this space. It destroys the Dell 5k screen in price, color reproduction, and brightness, which translates to stellar performance on the productivity front. I was determined to get this goddamn display working with some flavor of Linux, but step 1 was Windows 10.</p>
<hr />
<p>After some research into the TB3 spec, various motherboard manufacturers, and random tech blogs I learned the following:</p>
<ol>
<li>You can't just plug in a DisplayPort to Thunderbolt 3 cable from Amazon to drive this display, because those cables aren't bidirectional; they only convert TB3 output to DP</li>
<li>The dual-signal DisplayPort trick is NOT an Apple-specific innovation; it's part of the TB3 specification</li>
<li>TB3 can use up to 8 lanes of DisplayPort 1.2, whereas a normal connection uses 4</li>
<li>Most motherboard manufacturers providing onboard TB3 aren't even routing a single DisplayPort signal from integrated graphics</li>
<li>The only way to pass through the GPU output is to use an AIC (add-in card) that supports the TB3 spec AND graphics passthrough</li>
</ol>
<h2 id="heading2">Seems simple right?!</h2>
<p>Sure, if you can find a mobo and AIC… which turned out to be quite difficult. After a few hours of searching I was only able to find ONE AIC that was supported: the <a href="https://www.newegg.com/Product/Product.aspx?Item=N82E16815548003">AsRock Thunderbolt 3 AIC</a>. Amazingly, its <a href="ftp://asrock.cn/Manual/Thunderbolt%203%20AIC.pdf">manual</a> even explicitly states support for 5k@60Hz.</p>
<p>I HAD to test this so I went ahead and purchased (referencing the manual) a compatible Z270 board and the AIC. I went with the AsRock Z270 Extreme4 (since it was mATX) and the AsRock AIC from newegg.</p>
<p>After throwing in my 960 Pro, a Titan XP, a single DIMM, and a 6700k, I plugged in the AIC with a special Thunderbolt cable that inserts into a header on the motherboard. My guess here is that while the TB3 card is attached via PCIe, the chipset also probably communicates with the AIC in some way for bandwidth allocation, which is why that extra TB3 header is needed. The AIC failed to work on a motherboard without the special TB3 header.</p>
<p>I booted this system into Windows 10, installed a TB3 driver from the AsRock website, and shut down the system. I decided to plug both of the included cables (DisplayPort to DisplayPort and DisplayPort to Mini DisplayPort) into the AIC. The logic here is that a single DP cable is restricted to 4K, and in order to push the dual signal through TB3 I’d obviously need to attach two cables from the GPU to the AIC.</p>
<p>I plugged everything in, booted the system, and BOOM FULL 5k RESOLUTION 60HZ NO TINKERING!</p>
<p>The whole system works seamlessly and I’ve had no issues so far. I’m getting full 10-bit color support and the colors look fantastic. The speakers are still not cooperating (although my MacBook running W10 can drive them no problem). I’ve got a hefty overclock on the Titan XP and was able to run GTA V at 5k at an average of 65 frames per second.</p>
<p>After installing TB3 drivers on Arch I’m able to get this running on Arch Linux as well, so that’s yuugeee. Gnome is pretty good at display scaling and I use Atom, so both do well on the HiDPI screen. The TB3 driver is an Intel driver, so if your distro has TB3 drivers floating around this may work. Shoot me an email if there’s a distro you want me to test; I’d be happy to give it a shot.</p>
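<p>If you want to check whether a Linux box even sees a TB3 controller before chasing distro drivers, recent kernels expose a thunderbolt bus in sysfs once that Intel-written module loads. A rough sketch (paths assume a reasonably recent stock kernel):</p>

```shell
# Rough check for a Thunderbolt controller on Linux.
# /sys/bus/thunderbolt appears once the 'thunderbolt' kernel module
# is loaded; the devices directory lists controllers/attached devices.
if [ -d /sys/bus/thunderbolt ]; then
    echo "Thunderbolt bus present:"
    ls /sys/bus/thunderbolt/devices
else
    echo "No Thunderbolt bus detected (module not loaded or no controller)"
fi
```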
<h2 id="heading2"> TL;DR</h2>
<p>You can drive the LG Ultrafine 5k under Windows 10 using a Thunderbolt 3 AIC (the AsRock card is the only one I've tested so far) and a motherboard with a Thunderbolt 3 header on it.</p>
<h1>Booting into Linux from the Grub Prompt with an NVME Drive</h1>
<p><em>2016-09-09 · <a href="https://shantanujoshi.github.io/grub-prompt-nvme">https://shantanujoshi.github.io/grub-prompt-nvme</a></em></p>
<p>For the first time since my foray into Linux 4/5 years ago, I ran into the dreaded grub prompt at boot.</p>
<p>It looks a lot like this:
<img src="https://terminalinflection.com/wordpress/wp-content/uploads/2012/10/GRUB-Ubuntu.png" /></p>
<p>This basically means that the grub install is corrupted and GRUB doesn’t know how to boot.</p>
<p>There are quite a few tutorials on recovering from grub rescue mode; in fact, the image above includes most of the necessary steps for booting from the grub prompt. But here’s my experience recovering from this prompt with an NVMe PCIe SSD. In my first attempt I followed the steps for a regular recovery, hit a kernel panic, and had to start from scratch.</p>
<h2 id="heading2"> Steps to Recover from the Grub Prompt </h2>
<p>Here’s a step by step method to how I recovered from the grub prompt, fixed grub, and rebooted to write this post.</p>
<p><strong>Part 1: Getting Out of the Grub Prompt</strong></p>
<ol>
<li>Type <strong><code><span class="evidence">ls</span></code></strong> to see which drives are mounted</li>
<li>Type <strong><code><span class="evidence">ls (hd0,1)/</span></code></strong> and repeat this until you find the drive <strong><code><span class="evidence">(hdX,Y)/</span></code></strong> with your boot disk </li>
<li>Once you've found your boot disk type: <strong><code><span class="evidence">set root=(hdX,Y)</span></code></strong> where X is your drive number and Y is your partition number </li>
<li>The next part depends on your version of Linux (but if you're on anything newer than version 12 you're good, so keep reading)</li>
<li>Type <strong><code><span class="evidence">linux /vmlinuz root=/dev/nvme0nXpY </span></code></strong> where X is the drive number from hdX and Y is the number of the partition on that drive holding your boot files</li>
<li>Type <strong><code><span class="evidence">initrd /initrd.img</span></code></strong></li>
<li>Type <strong><code><span class="evidence">boot</span></code></strong></li>
</ol>
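<p>Put together, a full session at the grub prompt might look like the following; the drive, partition, and device numbers here are purely illustrative, so substitute your own from the <strong><code><span class="evidence">ls</span></code></strong> output:</p>

```
grub> ls
(hd0) (hd0,gpt1) (hd0,gpt2)
grub> ls (hd0,gpt2)/
lost+found/ boot/ etc/ home/ usr/ var/ vmlinuz initrd.img
grub> set root=(hd0,gpt2)
grub> linux /vmlinuz root=/dev/nvme0n1p2
grub> initrd /initrd.img
grub> boot
```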
<p><strong>Part 2: Booting into Linux and Repairing GRUB</strong></p>
<p>After beginning the boot process from the prompt I noticed a flickering TTY screen at boot, probably due to my Nvidia drivers or some other permissions error. I knew to basically press <strong><code><span class="evidence">Ctrl + F1</span></code></strong> until I got a TTY screen (which would disappear within 3 seconds due to the nature of this flickering bug, and could be brought back by pressing <strong><code><span class="evidence">Ctrl + F1</span></code></strong> again).</p>
<p>So while fighting with the flickering TTY screen I had to enter my username, password, and the command <strong><code><span class="evidence">sudo service gdm stop</span></code></strong>. This command may be <strong><code><span class="evidence">sudo service lightdm stop</span></code></strong> if you have a different display manager.</p>
<p>Basically my display manager was in a corrupted loop because the .Xauthority file’s ownership gets altered during the grub-prompt boot; it effectively changes to root. As a result, after stopping the display manager I was able to access the TTY and perform the following steps:</p>
<ol>
<li> Type the command <strong><code><span class="evidence">sudo update-grub</span></code></strong> in the tty </li>
<li> You should see a list of boot images being added to the grub menu; next enter <strong><code><span class="evidence">sudo grub-install /dev/nvme0nXpY</span></code></strong> as per your prior boot drive and partition</li>
<li> Now to check the .Xauthority permissions, type <strong><code><span class="evidence">ls -lah</span></code></strong> in your home directory to see what the permission settings are for the .Xauthority file. In my case the owner was "root:root"</li>
<li> To repair .Xauthority, type the following, replacing username with your username: <strong><code><span class="evidence">sudo chown username:username .Xauthority</span></code></strong>; in my case I typed shantanu:shantanu </li>
<li> Type <strong><code><span class="evidence">sudo reboot</span></code></strong> and hope for the best </li>
</ol>
<p>After this I waited for the grub screen, and indeed it worked. The nvme0nXpY device path was necessary for me to boot and subsequently type this post.</p>
<h1>Getting Pascal GPUs to Work on Linux</h1>
<p><em>2016-09-04 · <a href="https://shantanujoshi.github.io/nvidiabuntu">https://shantanujoshi.github.io/nvidiabuntu</a></em></p>
<p>I’ve been running Ubuntu on computers with discrete GPUs for a few years now. Not ONCE has it ever been easy. The problem is that Ubuntu’s default Nouveau drivers take time to gain compatibility, and Nvidia’s drivers hate working with (and disabling) Nouveau. Given my itch to have the latest and greatest laptop hardware, for the past few years I’ve had to hack away at the kernel, Nouveau, and Nvidia beta drivers to get Linux running on my various devices. Compatibility is generally introduced in kernel updates a couple of months after the release of new hardware, but ain’t nobody got time for that.</p>
<p>Around a month or so ago a friend at Facebook gifted me an Oculus Rift, so I decided to build my own computer. I bought the new Nvidia GTX 1080, a month later also purchased the new Pascal Titan X (or Titan XP, thanks LinusTechTips), and decided to dual boot Ubuntu 16.04 because I thought it’d be interesting to run CUDA benchmarks on the new GPUs. I also failed to find anyone posting about their success, or even failure, in trying to run these GPUs on Linux.</p>
<p>After hours of failing to get the beta Linux drivers from BOTH the PPA and the official Nvidia website to work with my install, I finally found a method that I think works 100% of the time for getting Ubuntu to work with dGPUs.</p>
<hr />
<h2> How to get new GPU's working with Ubuntu </h2>
<p>Note: the following steps assume a fresh Ubuntu install. If you’re trying to troubleshoot after installing, getting this to work requires at least being able to boot to TTY, which isn't guaranteed for certain installs. The basic framework is as follows:</p>
<ol>
<li> Install Ubuntu: to boot into the installer edit the boot parameters and add nomodeset before quiet splash</li>
<li> Boot into Ubuntu with nomodeset, connect to the internet, and run ALL updates</li>
<li> (Optional) Restart the system, add nomodeset again, download the latest kernel and upgrade the kernel</li>
<li> Restart the computer, same setup with nomodeset, add the graphics PPA, then logout </li>
<li> Switch to TTY, login, and stop the desktop environment service (gdm or lightdm depending on your OS)</li>
<li> Purge the nouveau driver</li>
<li> Install latest nvidia driver <strong><code><span class="evidence">nvidia-###</span></code></strong></li>
<li> Reboot to your OS</li>
</ol>
<p>These are the basic steps; the detailed tutorial follows below.</p>
<h2>0. Installing Ubuntu</h2>
<p>The steps for installing Ubuntu vary based on your needs. In my case I generally skip the preset options for installation and click “something else”, where I can add my own partition scheme. I prefer to create a partition for <strong><code><span class="evidence">/</span></code></strong> and <strong><code><span class="evidence">/home</span></code></strong>, generally allotting 30-50 gigs for / and the rest of the drive for /home. My tutorial assumes you’ve got a live disk of your chosen distro (I'm personally running Ubuntu Gnome 16.04 because the UI is fantastic).</p>
<h2>1. Booting into the Live Disk</h2>
<p>In order to boot into the live disk for the install you need to highlight the “Install Ubuntu” option at the live disk grub menu and press <kbd>e</kbd> to edit the commands, then add <strong><code><span class="evidence">nomodeset</span></code></strong> right after “quiet splash”. I also recommend changing quiet splash to <strong><code><span class="evidence">noquiet nosplash</span></code></strong>, but that’s optional.</p>
<div class="side-by-side">
<div class="toleft">
<p><img class="image" src="http://www.tecmint.com/wp-content/uploads/2016/04/Ubuntu-16.04-Boot-Screen.png" alt="Alt Text" /></p>
</div>
<div class="toright">
<img class="image" src="https://www.maketecheasier.com/assets/uploads/2009/12/ubuntukarmic-edit-grub-entr.png" alt="Alt Text" />
</div>
</div>
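<p>For reference, after the edit the installer's <strong><code><span class="evidence">linux</span></code></strong> boot line ends up looking something like this (the exact paths and parameters vary by release and boot mode; this is just an illustrative sketch):</p>

```
linux /casper/vmlinuz.efi file=/cdrom/preseed/ubuntu.seed boot=casper noquiet nosplash nomodeset ---
```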
<h2>2. Boot into Ubuntu</h2>
<p>After the install you should see the regular grub menu; once it appears, press <kbd>e</kbd> again to edit the boot commands and add nomodeset as in step 1. Nomodeset is necessary at boot because it disables certain video drivers which may fail with newer GPUs.</p>
<p>Once logged in, connect to the internet, open up a terminal window, and run: <strong><code><span class="evidence">sudo apt-get update</span></code></strong>, <strong><code><span class="evidence">sudo apt-get upgrade</span></code></strong>, and <strong><code><span class="evidence">sudo apt-get dist-upgrade</span></code></strong>. Some people chain these with <strong><code><span class="evidence">&&</span></code></strong> and add <strong><code><span class="evidence">-y</span></code></strong>, but I don’t mind running them one at a time in case there are errors.</p>
<p>Lastly reboot the system and use nomodeset to login again. At this point you’re an expert at adding nomodeset to the boot commands so just assume anytime there is a reboot (unless otherwise stated) just add nomodeset to boot.</p>
<h2>2.a (Optional) Kernel Update</h2>
<p>I like to run the latest RC kernel because there’s generally better compatibility with new hardware, but it’s not necessarily 100% stable. More like 98% stable in my experience. Given the new hardware, I at least recommend performing a kernel update to the latest (non-RC) kernel to be safe, but it’s optional.</p>
<p>Click <a href="http://kernel.ubuntu.com/~kernel-ppa/mainline/">here</a> to see the latest kernel builds:</p>
<ol>
<li>From the link above find the folder for the latest kernel (in my case at the time of writing this is 4.8-rc6 but 4.7 is the latest final release)</li>
<li> Depending on your distribution download 3 deb files for your distro. In my case I am running generic linux (default) and need the following deb files: <pre>linux-headers-4.8.0-040800rc6_4.8.0-040800rc6.201609121119_all.deb
linux-headers-4.8.0-040800rc6-generic_4.8.0-040800rc6.201609121119_amd64.deb
linux-image-4.8.0-040800rc6-generic_4.8.0-040800rc6.201609121119_amd64.deb</pre></li>
<li> Move the files to an empty folder, open terminal and cd to that folder</li>
<li> In order to install the kernels type: <strong><code><span class="evidence">sudo dpkg -i *.deb</span></code></strong> which will install each of the deb files you downloaded</li>
<li> After completing the install restart the computer and login</li>
</ol>
<h2> 3. Add the Graphics PPA </h2>
<p>Open terminal and type <strong><code><span class="evidence">sudo add-apt-repository ppa:graphics-drivers/ppa</span></code></strong> then run <strong><code><span class="evidence">sudo apt-get update</span></code></strong>.</p>
<h2> 4. Logout and Remove Nouveau</h2>
<p>Some people say removing nouveau isn’t necessary, but in my experience merely disabling it causes more problems, and it tends to re-enable itself during updates. As a result I recommend removing nouveau. In order to remove nouveau, disable the desktop environment, and install the graphics drivers, do the following:</p>
<ol>
<li> Logout of your account and at the login screen type <kbd>CTRL</kbd>+<kbd>ALT</kbd>+<kbd>F1</kbd> to switch to "tty1"</li>
<li> Type your username and password into the console to login to your account</li>
<li> Depending on your desktop environment, type either <strong><code><span class="evidence">sudo service gdm stop</span></code></strong> or <strong><code><span class="evidence">sudo service lightdm stop</span></code></strong> to disable the desktop environment</li>
<li>Next remove nouveau by typing: <strong><code><span class="evidence">sudo apt-get --purge remove xserver-xorg-video-nouveau</span></code></strong></li>
<li> Now that nouveau is removed, we can go ahead and install the latest nvidia driver by typing <strong><code><span class="evidence">sudo apt-get install nvidia-</span></code></strong> and pressing <kbd>TAB</kbd> to see the drivers available</li>
<li> In my case the latest driver (highest number after nvidia-) was "nvidia-370" so I installed the driver by typing <strong><code><span class="evidence">sudo apt-get install nvidia-370</span></code></strong></li>
</ol>
<h2> 5. Reboot the OS and Test</h2>
<p>The final step is to type <strong><code><span class="evidence">sudo reboot</span></code></strong> to restart the computer. This time you <strong><em> do not</em></strong> need to add nomodeset at the grub menu. If all goes well you should be able to boot into the OS without any problems.</p>
<p>So far I have been able to successfully run the latest nvidia GPUs on the following systems multiple times with these steps:</p>
<p>Desktops:</p>
<table>
<thead>
<tr>
<th style="text-align: left">Motherboard</th>
<th style="text-align: left">Processor</th>
<th style="text-align: left">GPU</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: left">Asus Maximus VIII Gene</td>
<td style="text-align: left">i7 6700k</td>
<td style="text-align: left">Titan XP</td>
</tr>
<tr>
<td style="text-align: left">Asus Z170 Pro Gaming</td>
<td style="text-align: left">i7 6700</td>
<td style="text-align: left">GTX 1080 and 1070</td>
</tr>
<tr>
<td style="text-align: left">Gigabyte Z170 Wifi</td>
<td style="text-align: left">i5 6600k</td>
<td style="text-align: left">GTX 1070</td>
</tr>
</tbody>
</table>
<p>Laptops:
Dell XPS 15 9550, Surface Book w/ dGPU (both of these needed additional tweaks to install and run a dual boot, but the graphics card install was identical)</p>
<p>If the install fails I recommend the following trouble-shooting steps:</p>
<ul>
<li>If you get the login screen but can't get the desktop, switch to TTY and check the permissions of the .Xauthority file in your home folder</li>
<li>If you're getting a flickering screen and no login screen, switch to TTY and try to stop the display manager and repair the install there</li>
</ul>
<h1>Modding an AIO Water Cooler for the Titan XP (Pascal)</h1>
<p><em>2016-09-02 · <a href="https://shantanujoshi.github.io/titanxp-watercooling">https://shantanujoshi.github.io/titanxp-watercooling</a></em></p>
<p>I recently got my hands on a new Pascal Titan X, and while the performance of the card is phenomenal, it has terrible idle and load temps. In this post I’ll describe my steps in modding a hybrid water-cooling kit built for the GTX 1080 to work with the new Titan X. <em>Scroll down to skip to the tutorial.</em></p>
<!-- from:https://superdevresources.com/image-caption-jekyll/ _includes/image.html -->
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/1.JPG" alt="Titan XP" />
<p class="image-caption">Titan XP</p>
</div>
<h2> Why water cool? </h2>
<p>Nvidia’s default blower-style coolers are relatively inefficient, and unfortunately AIBs (add-in board partners) are not allowed to release custom cooling solutions or modify the reference PCB Nvidia creates for the Titan series.</p>
<p>As a result the only solution to better performance and temperatures is water cooling. There is a large community of PC enthusiasts that build custom water cooling loops for their systems. Unfortunately custom loops are expensive and cumbersome. In addition they tend to make upgrading quite difficult. The next solution to water cooling a graphics card is a “hybrid-kit”. Basically an all in one water cooler that attaches to the GPU PCB and creates a water cooling loop.</p>
<p>EVGA (an AIB that releases custom hybrid kits) does not have a water cooling kit for the new Titan X. The only option is to modify a water cooling kit made for the Titan X’s younger sibling, the GTX 1080.</p>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/2.JPG" alt="EVGA Hybrid" />
<p class="image-caption">EVGA 1080 Hybrid Kit</p>
</div>
<h2> How to Mod the EVGA 1080 Hybrid Kit for the Titan X </h2>
<p>Installing the kit with or without mods first requires complete removal of the original cooler from the graphics card.</p>
<p><em>Warning: this will void your warranty with Nvidia, but if you put the original cooler back on there’s no way for them to know you’ve removed it. I’m not saying I recommend doing so, but I’m just saying there are no tamper-proof screws…</em></p>
<p><strong>Note before continuing:</strong></p>
<ul>
<li>Keep careful track of all the screws, you'll need them in the future </li>
<li>Try to find a dremel tool; if you don't have one, it's a perfect opportunity to pick one up. I'll specify the attachment bits used in each step</li>
<li>If you're afraid or unsure of removing the GPU cooler from the PCB I recommend watching this <a href="https://www.youtube.com/watch?v=H7HN3CDxMQk">video</a> detailing the teardown of the Titan X</li>
<li>You must ground yourself somehow while doing this. Click <a href="https://www.tomshardware.com/faq/id-2121341/ground-building-computer.html">here</a> to learn why it's necessary</li>
<li> This <a href="http://forums.evga.com/10801070-Reference-Hybrid-AIO-Kit-Install-Manual-for-PN-400HY5188B1-nonFTW-m2515115.aspx">link</a> has instructions from EVGA on installing the cooler on a 1080. Save it; you'll need it later on.</li>
</ul>
<h2>Step 1: Remove the IO Plate</h2>
<p>Start by removing the 5 screws on the IO plate, the two screws for the DVI port, and the two screws on the back plate holding the IO plate in place.</p>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/3.JPG" alt="Titan X IO Plate" />
<p class="image-caption">Remove this IO Plate</p>
</div>
<h2>Step 2: Unscrew the Cooler</h2>
<p>Get yourself two allen/hex keys, a tiny Phillips #00, and a larger Phillips head. I didn’t take pictures of the cooler removal given that there’s no real trick here. Here’s the best order of operations for the removal:</p>
<ol>
<li>Remove the back plate by carefully unscrewing the TINY screws holding it in place (one part of the backplate slides out of the other)</li>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/backplategone.JPG" alt="Backplate Removed" />
<p class="image-caption">Backplate Removed</p>
</div>
<li>Use the smaller allen key to remove all the screws surrounding the cooler; the four larger screws next to the clear window are only there to hold the window to the plate itself (see photo of larger screws)</li>
</ol>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/4.JPG" alt="Titan X Cooler Screws" />
<p class="image-caption">Remove all but the optional circled screws</p>
</div>
<h2>Step 3: Remove the Cooler</h2>
<p>Again, I didn’t think this step needed documentation. The best way to do it is to remove the section with the window first; you will experience some resistance from the thermal compound on the heatsink, but ignore it. Next remove the second section of the cooler; be careful of the wire connecting the fan to the PCB and detach it gently. Once the entire cooler is removed, wipe the GPU surface with a microfiber cloth dipped in some alcohol…</p>
<p>Watch out for these wires when removing the two parts of the cooler</p>
<div class="side-by-side">
<div class="toleft">
<p><img class="image" src="/assets/images/TitanXPMod/blowerfanwire.JPG" /></p>
</div>
<div class="toright">
<p><img class="image" src="/assets/images/TitanXPMod/ledwire.JPG" /></p>
</div>
</div>
<p>It should look something like this:</p>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/5.JPG" alt="Titan X PCB" />
<p class="image-caption">Nvidia GP102</p>
</div>
<h2>Step 4: Modifications to the EVGA Kit</h2>
<p>The EVGA kit consists of 3 major parts: the AIO water cooler, a PCB plate, and a top cover. The AIO cooler fits onto the GPU perfectly (even though the Titan X uses the larger GP102 chip rather than the 1080's GP104, the cooler mounting lines up); the only issue is mounting the PCB plate with the fan onto the Titan X PCB. The plate needs to accommodate the following features of the Titan X:</p>
<ol>
<li> Extra Capacitors</li>
<li> Extra 6-pin power connector </li>
<li> Additional Phase Chip </li>
</ol>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/6.JPG" alt="Additional PCB Items" />
<p class="image-caption">Additional Titan X Components</p>
</div>
<h2>Step 5: Make Room for an Extra 6-Pin</h2>
<p>I started with this as it seemed the simplest. I used a cutting tool and eyeballed the additional cuts needed for the plate. The removed section is highlighted below:</p>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/7.JPG" alt="Space for 8+6Pin" />
<p class="image-caption">Area Removed for 6-Pin</p>
</div>
<h2>Step 6: Accommodating the Extra Phase Chip</h2>
<p>Cut out a section for the extra phase chip. This part is tricky: I started by using a cutting tool to cut out a square a bit smaller than the area I thought I needed (all eyeballed). After that I brought the plate close to the PCB to see how much more metal I would have to remove, and used a grinding bit to cut away slowly at the plate.</p>
<p>Here are my results:</p>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/8.JPG" alt="Cuts for Phase Chip" />
<p class="image-caption">Area Removed for Extra Chip</p>
</div>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/9.JPG" alt="Larger View of Cuts" />
<p class="image-caption">Wider View of the Cut for Scale</p>
</div>
<h2>Step 7: Grinding a Space for the Capacitors</h2>
<p>This is the aftermath of the panel after I ground away the metal above the capacitors. I highly recommend using a dremel with the following <a href="https://www.dremel.com/en-us/Tools/Pages/ToolDetail.aspx?pid=541">bit</a>. It’s included in most sets and makes it very easy to grind the surface away.</p>
<p><strong><span class="evidence">Note: Definitely test fit the plate to see if it’s flush with the PCB and keep repeating Step 5-7 until it’s flush.</span></strong></p>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/11.JPG" alt="Slightly Zoomed Out" />
<p class="image-caption">Slight Collateral Dremel Damage on the Sides</p>
</div>
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/TitanXPMod/10.JPG" alt="Side View of Dremel Cut" />
<p class="image-caption">Plenty of Space for the Capacitors</p>
</div>
<h2> Step 8: Reassembly</h2>
<p>The steps from here on are exactly the same as in the tutorial from EVGA. The top plate that we ground down and the water cooler should easily attach to the PCB. Again, follow this <a href="http://forums.evga.com/10801070-Reference-Hybrid-AIO-Kit-Install-Manual-for-PN-400HY5188B1-nonFTW-m2515115.aspx">link</a> for details on reassembly.</p>
<h2> Step 9: Final Product!</h2>
<p>See any pictures online of any hybrid card. The only thing differentiating this is that the PCB is low key a Titan :wink:</p>
<h2> Temps/Results </h2>
<p>This card on water is absolutely insane. Detailed benchmark on the way…</p>
<h1>Benchmarking the Sieve of Eratosthenes</h1>
<p><em>2016-08-19 · <a href="https://shantanujoshi.github.io/soe">https://shantanujoshi.github.io/soe</a></em></p>
<p>Advanced concepts in computer science can sometimes be difficult to understand and difficult to execute. A great example of this is tail recursion in functional programming. The concept is easy to explain, and the notion of it running in constant stack space is easy to accept. However, building programs and algorithms to be tail recursive can be a difficult task that isn’t very easy to teach. In a similar fashion, explaining Floyd-Warshall is, in my opinion, much harder than writing the simple program that implements it.</p>
<p>Learning the Sieve of Eratosthenes, an ancient and intuitive algorithm that generates prime numbers, reminded me of my first time learning about summing up numbers with for-loops in an intro Java class in high school. I felt this extreme sense of confidence and comfort with programming that began fading away after my first college lab assignment in C. The <span style="color:#1abc9c">SOE</span> just clicks in a way not many <em>“advanced”</em> concepts do in CS. (Yes, I’m abbreviating Sieve of Eratosthenes to save 21 characters; you’re welcome.)</p>
<h2 id="heading2">Naive Algorithm to Test for Primes</h2>
<p>Testing for a prime number is tricky; here’s a simple slow method to start with:</p>
<pre><strong>Given a number n, if any integer from 2 to squareroot(n) evenly divides n, then n is NOT prime.</strong></pre>
<p>This is reallllly slow, and if your number is massive this method gets even worse. Regardless, it’s probably a great intro-to-programming homework problem (hint hint, CS101 profs), but in the business of primes we need something faster. Enter <span style="color:#1abc9c">SOE</span>…</p>
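<p>To make the comparison concrete later on, here’s that naive method as a few lines of C (the function name is my own, not from any library):</p>

```c
#include <stdbool.h>

/* Naive trial division, as described above: if any integer from 2 up
   to squareroot(n) divides n evenly, n is not prime. The condition
   d * d <= n avoids computing an actual square root. */
bool is_prime_naive(long n) {
    if (n < 2)
        return false;        /* 0, 1, and negatives are not prime */
    for (long d = 2; d * d <= n; d++)
        if (n % d == 0)
            return false;    /* found a divisor, so n is composite */
    return true;
}
```

Checking a single huge number this way costs on the order of squareroot(n) divisions, which is why it falls over when you need lots of primes.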
<h2 id="heading2">The Sieve</h2>
<p>The Sieve of Eratosthenes was developed by the Greek mathematician Eratosthenes somewhere around 240ish BC.</p>
<p>I was actually taught the sieve in a course preparing students for programming competitions where we were learning methods of Prime number generation. My love for prime numbers is non-existent, but for some odd reason I found this singular algorithm infuriatingly beautiful. Sure, it doesn’t have the overwhelming existential impacts of max flows and single shortest paths but something about the simple and intuitive nature of the <span style="color:#1abc9c">SOE</span> differentiated it from the likes of every other algorithm in this class. I found myself revisiting the algorithm two weeks later in a parallel computing course attempting to translate my java to c code and running it through OpenMP for a lab.</p>
<hr />
<h2 id="heading2">The Actual Algorithm</h2>
<p>The goal is to create a list of prime numbers that we can reference in constant time; ideally we can generate this list more quickly than the naïve approach.</p>
<p>Here’s the algorithm:</p>
<ol>
<li>Generate a list of integers from 2 to some limit (for our purposes let's use 21)</li>
<pre><strong>2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21</strong></pre>
<li>Since the first number of this list is 2, cross out every <em>2nd</em> number on the list after 2</li>
<pre><strong>2 3 <del>4</del> 5 <del>6</del> 7 <del>8</del> 9 <del>10</del> 11 <del>12</del> 13 <del>14</del> 15 <del>16</del> 17 <del>18</del> 19 <del>20</del> 21 </strong></pre>
<li>The next number is 3, so same idea: cross out every <em>3rd</em> number on the list after 3</li>
<pre><strong>2 3 <del>4</del> 5 <del>6</del> 7 <del>8</del> <del>9</del> <del>10</del> 11 <del>12</del> 13 <del>14</del> <del>15</del> <del>16</del> 17 <del>18</del> 19 <del>20</del> <del>21</del></strong></pre>
<li>5 is the next number, however no multiples of 5 remain to be crossed out</li>
<li>We continue until there are no more numbers left to count off of, and what we are left with is our list of primes</li>
<pre><strong>2 3 5 7 11 13 17 19</strong></pre>
</ol>
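The steps above translate almost line-for-line into code. Here’s a minimal Python sketch (mine, not the benchmarked implementation):

```python
def sieve(limit):
    """Return every prime from 2 to `limit` using the Sieve of Eratosthenes."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    n = 2
    while n * n <= limit:
        if is_prime[n]:
            # Cross out every nth number; start at n*n, since smaller
            # multiples were already crossed out by smaller primes.
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
        n += 1
    return [i for i, prime in enumerate(is_prime) if prime]

print(sieve(21))  # → [2, 3, 5, 7, 11, 13, 17, 19]
```

Note the work per number is just a handful of strike-outs, instead of a full divisor scan per candidate as in the naive test.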
<h2 id="heading2"> Simple right?</h2>
<p><strong><span style="color:#EB298C">YES!</span></strong> I think that’s the beauty of it. I’d like to take this simple concept and see how far I can attempt to speed it up and benchmark it.</p>
<p>Watch this GIF that illustrates the algorithm perfectly (thanks Wikipedia); reload the page if it’s finished.</p>
<p><img alt="Sieve of Eratosthenes animation" src="https://upload.wikimedia.org/wikipedia/commons/0/0b/Sieve_of_Eratosthenes_animation.svg" /></p>
<p>And that’s the <span style="color:#1abc9c">SOE</span>. There are faster variations (see Euler’s Sieve and this <a href="https://www.cs.utexas.edu/users/misra/scannedPdf.dir/linearSieve.pdf">paper</a>). <em>But this algorithm is relatively easy to explain and understand.</em></p>
<!---
<hr/>
<h2 id="heading2"> Here's the sieve in Java: </h2>
<h2 id="heading2">Here's the implementation in C: </h2>
Parallelization? Is this a word? Making to parallels codes:
This is not a guide to parallelizing code, If you're interested in learning I can point you to my amazing college professor for the course I took; his course material is all in comic sans on a jarring neon background, but his lectures were fantastic.
<h2 id="heading2">This is probably not a great CUDA task:</h2>
-Running the sieve with OpenMP or MPI makes a lot of sense, I do think that achieving a speedup from GPU parallelizations might be difficult given memory restrictions in cuda but I decided to investigate.
-Runnign the sieve on CUDA probably doesn't make direct sense given the speedup that induced by GPU parallelization. But I have a brand new Water Cooled Psacal Titan X from Nvidia and it would be a crime to leave all those CUDA cores to rendering GTA V at 4K Resolution and training baby predictive models for my fantasy football league… So I'm going to try to find a way to utilize the GPU cuda cores to generate primes.
-->shantanujoshiCompressing Images in Python2016-07-31T15:08:00+00:002016-07-31T15:08:00+00:00https://shantanujoshi.github.io/python-image-compression<p>While working on website performance tuning for some part-time work I needed to find a way to compress 100,000+ images for an ecommerce website’s catalog. The current solution to image compression was a “designer” who would bulk process images in Photoshop with a macro. He had worked for 4 days straight and estimated a 2-week completion time for processing all the pictures.</p>
<p>The images were simply too large in file size and the wrong dimensions. Each image had to be both scaled to fit within fixed square pixel dimensions and reduced in file size.</p>
<p>I wrote a Python script (attached below) to compress the files for this specific task. But I thought I’d rewrite it to work for compressing images for this blog, given that having a post with 15 10MB images would not be great for load times.</p>
<p>Here’s my version of the image compression script, built for compressing images in a given directory to a manageable size using PIL (or Pillow), a Python library that does most of the work here.</p>
<p><strong> If you’re looking for something that can also change image dimensions see <a href="https://gist.github.com/ShantanuJoshi/44e9b72a985d5d6b4e8df2810ce5d25e">here</a> </strong></p>
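The core of the approach is only a few lines with Pillow. Here’s a rough sketch of the idea (function and parameter names are mine, not the gist’s):

```python
import os
from PIL import Image  # pip install Pillow

def compress_images(directory, quality=60):
    """Re-save every JPEG in `directory` in place at a lower quality setting."""
    for name in os.listdir(directory):
        if not name.lower().endswith((".jpg", ".jpeg")):
            continue
        path = os.path.join(directory, name)
        img = Image.open(path)
        # `optimize=True` makes an extra encoding pass to shave off bytes;
        # `quality` trades file size against visible compression artifacts.
        img.save(path, "JPEG", optimize=True, quality=quality)
```

A quality setting around 60-70 is usually a reasonable starting point for web images; go lower only if artifacts stay acceptable.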
<h2 id="heading2">CompressMe.py</h2>
<script src="https://gist.github.com/ShantanuJoshi/23ac55479ab9a613230bd9467d080f33.js"></script>shantanujoshiSalvaging a Broken Lenovo Yoga 9002016-07-06T14:24:00+00:002016-07-06T14:24:00+00:00https://shantanujoshi.github.io/lenovo-yoga-screen-repair<p>Due to various random circumstances I was given a Lenovo Yoga 900 with a shattered screen for free. The laptop is worth around $1200 new, but this guy had a broken screen and no charger. After getting a replacement charger and booting the computer I learned that it had an i7 processor, 16 gigs of ram, and a high resolution display :flushed:. I decided to find a replacement screen on eBay for $300 and try to fix the laptop myself. Given that the laptop had just launched, I had no reference or teardowns and no one seemed to have taken apart the display assembly, but here’s my attempt to replace the screen.</p>
<h2> Removing the Old Panel</h2>
<p>I found out that the screen I purchased was JUST the LED assembly. So I tried to use a heat gun to melt the adhesive separating the LED panel from the display housing itself, and here’s what I ended up with:</p>
<!-- from:https://superdevresources.com/image-caption-jekyll/ _includes/image.html -->
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/yogafix/1.jpg" alt="Broken Screen" />
<p class="image-caption">Display Housing Removed from the Panel</p>
</div>
<h2> Now what? </h2>
<p>Yes, the entire display assembly is shattered and there’s no way of reattaching the new panel to the housing… but all the housing does is hold a few sensors, the webcam, and some antennas in place, while also adhering the display itself to a back plate that attaches to the laptop. I decided that thick double sided foam tape would be more than enough to replace the side display housing and attach the screen to the back plate.</p>
<p>In addition I just used the double sided tape to attach the sensors I removed directly to the replacement display panel. Here’s what it looks like:</p>
<!-- from:https://superdevresources.com/image-caption-jekyll/ _includes/image.html -->
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/yogafix/2.jpg" alt="Adding sensors" />
<p class="image-caption">Adding the Webcam/Sensors/Antennas</p>
</div>
<!-- from:https://superdevresources.com/image-caption-jekyll/ _includes/image.html -->
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/yogafix/3plus.jpg" alt="Side View" />
<p class="image-caption">The Entire Display</p>
</div>
<p>Not too bad…</p>
<h2> Filling the Space of the Display Assembly</h2>
<p>In order to fill the missing space and attach the display to the back plate (which is attached to the laptop hinge) I used strong 3M double sided foam tape. I stacked the tape in order to make sure that the housing was evenly attached and sort of eyeballed the amount. Here’s the taped up display: (notice the gray tape along the edges)</p>
<!-- from:https://superdevresources.com/image-caption-jekyll/ _includes/image.html -->
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/yogafix/3.jpg" alt="Display w. Tape" />
<p class="image-caption">Gray 3m Tape Along the Display Edges, Back Plate Underneath</p>
</div>
<h2> Attaching Everything</h2>
<p>As I attached the display to the housing I quickly realized that I had made a huge mistake attempting to stack the 3M tape. Not realizing that the back panel is slightly curved to account for the display itself, I used absolutely double the amount of tape I needed to fill this gap. As a result, here’s what the side of the laptop looks like:</p>
<!-- from:https://superdevresources.com/image-caption-jekyll/ _includes/image.html -->
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/yogafix/4.jpg" alt="Side View" />
<p class="image-caption">Foam Tape Bezels</p>
</div>
<h2> It definitely still works...</h2>
<!-- from:https://superdevresources.com/image-caption-jekyll/ _includes/image.html -->
<div class="image-wrapper">
<img src="https://shantanujoshi.github.io/assets/images/yogafix/5.jpg" alt="Side View" />
<p class="image-caption">Windows 10 Testing</p>
</div>
<h2> Aftermath</h2>
<p>I decided to dual boot Linux on this laptop and used it as a primary computer for a while. I love the form factor but definitely didn’t need another computer. I decided to sell it, pricing in the fact that the display bezel was missing. The broken laptop would’ve sold for around $200-$300 according to sold listings on eBay. After fixing the screen I was able to get $700, so no complaints there.</p>shantanujoshiNYC Mechanical Keyboard Meetup2016-05-26T12:12:00+00:002016-05-26T12:12:00+00:00https://shantanujoshi.github.io/nyc-keyboard-meetup<p>I recently hosted a mechanical keyboard meetup in NYC for keyboard enthusiasts and those wanting to learn about the hobby. The details of the meetup were posted on reddit and in our meetup group <a href="https://www.meetup.com/Click-clack-Mechanical-Keyboard/events/230786767/">here</a>. I was incredibly surprised to find out that nearly 50 people heard about the meetup and attended. We even had Steph and Pete from <a href="https://1upkeyboards.com">1upkeyboards</a> stop by to showcase some of their custom cables.</p>
<p>If you’re curious about keyboards I recommend checking out the <a href="https://reddit.com/r/mechanicalkeyboards">subreddit</a> and perhaps even the <a href="https://geekhack.org">geekhack forums</a>.</p>
<h2> Here's some pictures from the event. </h2>
<p>The first album was taken by me; the other two are from a couple of other attendees. (Click through to imgur for high-res pictures.)</p>
<center>
<blockquote class="imgur-embed-pub" lang="en" data-id="a/Uorou"><a href="//imgur.com/Uorou">NYC 5/26 Mech Meetup</a></blockquote><script async="" src="//s.imgur.com/min/embed.js" charset="utf-8"></script></center>
<center>
<blockquote class="imgur-embed-pub" lang="en" data-id="a/rFkQe"><a href="//imgur.com/rFkQe">NYC Mechanical Keyboard Meetup</a></blockquote><script async="" src="//s.imgur.com/min/embed.js" charset="utf-8"></script></center>
<center>
<blockquote class="imgur-embed-pub" lang="en" data-id="a/QyKxG"><a href="//imgur.com/QyKxG">Mechanical Keyboards NYC Meetup</a></blockquote><script async="" src="//s.imgur.com/min/embed.js" charset="utf-8"></script>
</center>shantanujoshiDoge2016-01-23T22:10:00+00:002016-01-23T22:10:00+00:00https://shantanujoshi.github.io/doge<p><img src="http://i.imgur.com/sx44AEd.jpg" alt="Screenshot" /></p>shantanujoshiFirst Post Readme2016-01-01T00:00:00+00:002016-01-01T00:00:00+00:00https://shantanujoshi.github.io/first-post-readme<p>This is my blog, website, braindump.</p>
<p>I post projects here. And stuff I’ve built. And stuff I want to share with people because I still can’t figure out twitter :unamused:</p>