
Ubuntu on Windows

(blog.dustinkirkland.com)
2049 points | bpierre
bcantrill · No.11392265
This is great to see, as it's very similar to the approach that we took with LX-branded zones on SmartOS[1][2]. I commented at some length on the other thread on this on HN[3], but I have a bunch of questions about apps that we know to be thorny: Go, strace, tcpdump, systemd, etc. As we learned, this approach is entirely possible -- but there are many, many details to be nailed before you get to the point that you can run production applications on it. So while the journey across the uncanny valley of Linux is long and arduous, we know from our experience that it can be done. Very much hoping that Microsoft gets to the other side -- and that they open source it all so we can all learn from one another!

[1] http://www.slideshare.net/bcantrill/illumos-lx

[2] http://us-east.manta.joyent.com/patrick.mooney/public/talks/...

[3] https://news.ycombinator.com/item?id=11392119

crudbug · No.11392299
I think M$ is targeting developers with a *NIX background on the desktop side rather than Linux apps on the server. So bash support with a uniform CLI is the end-game.
talawahdotnet · No.11392350
Yea, I think they are going after developers who use OS X because it is UNIXy. Smart move given how en vogue Apple laptops have become for developers these days.
Delmania · No.11392717
>Yea, I think they are going after developers who use OS X because it is UNIXy

It's a bit more than UNIXy (the proper term is Unix-like); it literally is UNIX. It meets the UNIX 03 specification.

Also, the motivations for this move predate the rise in popularity of Apple. For years, one of the biggest complaints about Windows was the lack of a good command line interface. There was the legacy CMD.EXE, which provided support for DOS commands and batch files, and PowerShell, which people either love or hate. The reality, however, is that the combination of bash/zsh with coreutils, binutils, util-linux, etc. overwhelmingly won out a long time ago. Most schools use a flavor of Linux (maybe Solaris) for teaching Computer Science and related disciplines, so many people with formal training are used to those tools. Those people then teach other people to use them, and so on.

Some people bemoan the fact that the CLI never evolved past its UNIX origins, but the reality is these tools work just fine. There's never been a reason to evolve them.

http://www.opengroup.org/openbrand/register/brand3612.htm

Someone · No.11393795
"the reality is these tools work just fine. There's never been a reason to evolve them."

On the contrary; there have been many valid reasons to evolve them, but backward compatibility was deemed more important.

Example #1: it is possible to write a sh/csh/bash/?sh script that handles file names with spaces, slashes, quotes, question marks, etc., but one would hope that would have been made a bit easier, almost half a century later.
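A minimal sketch of the quoting discipline this requires (the directory and file name here are hypothetical): every expansion of a variable holding a file name must be double-quoted, or the shell splits it on whitespace.

```shell
# Hypothetical demo directory containing a name with a space
d=$(mktemp -d)
touch "$d/my file.txt"

# An unquoted $f would be word-split into "my" and "file.txt";
# quoting "$f" keeps each file name intact
for f in "$d"/*; do
  printf 'found: %s\n' "$f"
done
```

Forget the quotes around even one `$f` and the script silently breaks on such names, which is exactly the ergonomic complaint being made.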

Example #2: the hack that is xargs for handling large numbers of arguments. To write a truly robust script that handles directories with an arbitrary number of files, one should run a pipeline using find and xargs, instructing xargs to do the actual work (and you cannot even use find and xargs with their default settings; you need -print0 and -0 flags to handle file names with spaces, etc)
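The pipeline the comment describes, as a sketch (paths are hypothetical): `-print0` makes find emit NUL-terminated names, and `-0` makes xargs split on NUL, so names containing spaces or even newlines survive.

```shell
d=$(mktemp -d)
touch "$d/a file.txt" "$d/b file.txt"

# Default xargs splits on whitespace and would mangle these names;
# the -print0 / -0 pair passes them through intact, however many there are
find "$d" -type f -print0 | xargs -0 wc -c
```

With the default (whitespace-splitting) settings, the same pipeline would hand `wc` four bogus arguments instead of two file names.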

If programs received arguments unexpanded, and the system had a library for expanding arguments, many use cases would become a lot simpler, and scripts could become more robust.

And yes, that could have been evolved. Headers of executables could easily contain a bit indicating "I'll handle wild-card expansion myself".
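The distinction being proposed can be seen in the shell today (directory is hypothetical): the shell expands an unquoted glob before the program runs, while a quoted pattern reaches the program literally, which is what a self-expanding program would receive.

```shell
d=$(mktemp -d)
touch "$d/a.txt" "$d/b.txt"

# Unquoted: the shell expands the glob; the program sees the file names
printf '%s\n' "$d"/*.txt

# Quoted: expansion is suppressed; the program sees the literal pattern,
# and could call a library (e.g. glob(3)) to expand it itself
printf '%s\n' "$d/*.txt"
```

The proposal amounts to making the second behavior the default, with expansion opt-in via a library, rather than the shell deciding for every program.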

Example #3: man pages, IMO, should be stored in a special section inside binaries. That ensures that the man page you read is the man page for the executable you have.

Example #4: http://unix.stackexchange.com/questions/24182/how-to-get-the... shows that things _have_ evolved. Reading and parsing /etc/mtab isn't a reliable way to find all mount points, just as reading /etc/passwd file isn't the way to find password hashes anymore, ar has long been upgraded to support file names longer than 14 characters, and zip knows more file attributes than it used to.

anthk · No.11394292
1) "" or ''

2) Now parallel is better

3) Never. You should be able to check the man pages even if you can't access the binaries.