It's time for me to get a new laptop.
I've been using a Macintosh laptop for a number of years now and generally like the system. The installation of DSS community edition is fairly straightforward on a Mac.
That said, I'm open to a new configuration, and ready to revisit the basic question: Mac vs. PC (or Linux?). I know that in the past, Dataiku's suggestion for running DSS on a PC was to use Linux in VirtualBox. (This did not work for me when I last tried it, back in 2017.)
I just found this post from @Alex_Reutter about using Windows 10 WSL (Windows Subsystem for Linux). It's been a while since that post was originally made. If you are around, @Alex_Reutter, I would love to hear what your current experience is. Is WSL a viable alternative to a Macintosh for DSS on an MS Windows PC? How's the performance?
Now to the question of what I'm looking for from my new computer:
It appears that I don't have the ability to get a CUDA/PyTorch-supported GPU on a Macintosh. (Thanks, Apple & Nvidia.) Whereas something like the new Surface Book 3 provides a built-in Nvidia GPU. However, it is generally designed to run Windows.
I'm wondering if anyone has any experience with WSL 2 and DSS on MS Windows 10. I've also noted a presentation at MS Build 2020 about the forthcoming GPU support for WSL 2.
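Once that GPU support lands in WSL 2, the quickest sanity check from inside the WSL shell is whether `nvidia-smi` can see the card, since GPU passthrough makes the same driver tooling visible as on native Linux. A minimal sketch (the helper name `cuda_gpu_visible` is mine, not anything from DSS):

```python
import shutil
import subprocess


def cuda_gpu_visible() -> bool:
    """Return True if nvidia-smi is on the PATH and lists at least one GPU.

    Behaves the same inside a WSL 2 shell as on native Linux, so it is a
    quick way to confirm whether WSL 2's GPU passthrough is working.
    """
    if shutil.which("nvidia-smi") is None:
        return False  # driver tooling not installed or not passed through
    try:
        result = subprocess.run(
            ["nvidia-smi", "-L"],  # "-L" lists detected GPUs, one per line
            capture_output=True,
            text=True,
            timeout=10,
        )
    except (OSError, subprocess.TimeoutExpired):
        return False
    return result.returncode == 0 and "GPU" in result.stdout


print(cuda_gpu_visible())
```

On a machine without the Nvidia driver (or a WSL 2 instance without passthrough), this just prints `False` rather than raising, which makes it safe to drop into a setup script.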
Or do folks have any ideas about how to get essentially unlimited access to a GPU from a Macintosh without lots of incremental costs?
Thanks for any thoughts you might share.
@GreaseMonkey Thanks for the reply.
On what type of hardware have you been using LinuxMint and DSS?
I'm thinking about what to replace my 2013 13-inch MacBook Air (1.7 GHz i7, 8 GB RAM, 512 GB SSD) with. Recently it has been working hard to keep up.
I run a desktop system at home that I built: ASRock X99 Extreme4 mobo w/ a cheapo NVidia Card.
I also used to have an HP Omen laptop that ran really well. It only had 12 GB of RAM, but it did great. Touchpad, 4K screen, wireless, sound, and everything on it worked out of the box.
System76 has some decent-looking hardware and their own Linux distro.
Did you wind up figuring this out?
I had been a Windows user for years (primarily because that is what my IT team supports) and had been using WSL for a while. It was pretty straightforward to get DSS set up on WSL (that's where I started). I've also had some experience with OS X and Linux (Ubuntu specifically) from a few years back.
I would also encourage you to try a Linux distro. For the past few months I have switched back to Ubuntu 18.04 on my personal desktop and have become quite fond of it. I have also read recently that Ubuntu 20.04 LTS is a big upgrade over 18.04. I will say, I do miss some of the integration with the Microsoft ecosystem (Teams/OneDrive/etc.) that is required for collaborating with my company - but most of those tools are available through web clients anyway. Other than that I'm having a blast.
The primary reason I decided to hop back to Linux is that I like to tinker with a lot of open source tools, and in my experience running them on WSL or Docker for Windows, they would "generally" work, but every now and then I'd run into environment/setup issues that became a pain. When self-learning those tools, there is also vastly more content on the interwebs for Linux than for Windows.
This was also the case as I was doing some GPU-accelerated deep learning with TensorFlow locally. If your GPU learning is specific to DSS, it sounds like Linux may be the way to go. If you are interested in GPU-accelerated learning just for fun (outside of DSS), Google Colab offers free notebook environments with access to a GPU.
Best of luck in your search. Let us know what you decide. (and at least the crypto craze has passed and you can actually get a GPU for a reasonably normal price)
I'm leaning toward a 13-inch 2020 MacBook Pro (the four USB-C port model), because it just works as a daily driver, and as a family we are invested in the Apple ecosystem.
I hear you about libraries not working well on Windows or macOS. I'm starting to run into this on my older MacBook Air.
That said, I will almost certainly make a new MacBook Pro at least dual-boot Windows with Boot Camp, and use that partition with either Parallels or VMware Fusion. (I've been doing this for years. That will give me either emulated hardware or non-emulated, direct hardware access to MS Windows 10 with WSL 2 and Terminal.)
I've also considered trying to set up a triple boot with Linux as the third OS. (That would increase the disk space requirements for the overall configuration.) I'm not clear whether Parallels or VMware Fusion can boot a Linux partition on a Mac in the same way that they can boot a Boot Camp partition. (I don't really want to use one of the virtualization tools' disk images, because I can't boot the computer directly into the Linux OS from one of those.)
This setup would allow me to "hop onto either Windows or Mac OS" and maybe even one of the Linux Distros.
The real problem with the Mac as a choice is that I do not get an Nvidia GPU, and of course there's the Apple Tax. Apple only seems to want to support AMD right now. (There are some hacks that "might" work with eGPUs or PlaidML, but I'm not seeing very clear evidence that folks are having good success.) Best I can tell, state-of-the-art DL is only being done with Nvidia GPUs. AMD and Apple don't seem to have good answers in this space for developers.
Going with a non-Apple Intel-based laptop gives me at best an "it sort of works" Hackintosh-type solution for the dual-boot or triple-boot gaming laptop. That said, I'd like the daily-driver part of the computer to be solid and well-integrated into the rest of my life.
So, if I go with the MacBook Pro, I may go to someone like Paperspace and rent ML/DL-optimized access to Nvidia GPUs. I'm aware of Google Colab as well. One of these will work with fast.ai.
I've also considered getting a single-board computer from Nvidia called the NVIDIA® Jetson Xavier NX™ Developer Kit, and using Docker or Kubernetes to load jobs onto that GPU from my laptop for overnight-type runs on Nvidia hardware. If this works well, I might then create a bit of a cluster out of several devices like this.
Finally, I'm planning to attend/listen to WWDC 2020 coming up in a few weeks before making a final decision. There are rumors that Apple will be switching from Intel processors to an ARM platform in the coming years. I'm also interested in their solutions when it comes to DL compute. If I'm going to hear anything useful about this in the Apple ecosystem, WWDC will be the place.
Thanks, I'll reach out when I get this all resolved.
With Apple's announcements today (they are moving away from Intel processors to their own ARM-based processors)... Hmmmmm. Now, where do I go?
I feel myself less inclined to drop major bucks on a high-end MacBook Pro 16, because in 6 months the architecture will be different, and in 2.5 years the current computer will be moving on toward obsolescence.
There is a piece of me thinking about jumping to the bleeding edge and getting one of their developer Mac minis. I'll have to see what strings are attached and what the price is.
I know that DSS does not yet support the Apple ARM architecture. However, some ML frameworks apparently now support the ML cores of the M1 chip, including TensorFlow. I find that an interesting outcome.
A quick update.
For now, I've postponed getting a new laptop. I'll keep using my old i7 MacBook Air. It's okay-ish for mobile activity.
Although the M1 Macs look super interesting, most data science software is still getting its act together regarding compatibility with Apple's ARM chips.
For now, I've gotten one of the older Intel Mac minis, refurbished from Apple. This meets my short-term needs, and over the long run it will make a good home server.
Today I discovered the Kubuntu Focus M2. It appears to be an interesting data science laptop, available with an Nvidia RTX 2060, 2070, or 2080 Super. Is anyone else out there using this computer?