    146 points MaysonL | 17 comments
    1. linguae ◴[] No.43959492[source]
    Whenever I think about the decline of industrial research labs, not just the most famed like Bell Labs and Xerox PARC, but also the labs at companies like Microsoft, IBM, Hewlett-Packard, Digital Equipment Corporation, Sun Microsystems, Oracle, Intel, and many others, I'm reminded of a quote from Alan Kay about how those who have financially benefited from applying the results of research have often not "given back" to research:

    "It strikes me that many of the tech billionaires have already gotten their "upside" many times over from people like Engelbart and other researchers who were supported by ARPA, Parc, ONR, etc. Why would they insist on more upside, and that their money should be an "investment"? That isn't how the great inventions and fundamental technologies were created that eventually gave rise to the wealth that they tapped into after the fact.

    "It would be really worth the while of people who do want to make money -- they think in terms of millions and billions -- to understand how the trillions -- those 3 and 4 extra zeros came about that they have tapped into. And to support that process." (from https://worrydream.com/2017-12-30-alan/).

    Even before Trump's and DOGE's reckless attacks on research and academia, the software industry (I'm going to limit this to the software industry; I don't know the situation in other STEM industries such as health care, pharmaceuticals, chemicals, aerospace, etc.) had changed its strategy regarding funding research. Before the 2010s, many major companies had research labs whose employees worked on medium- and long-term projects that might not have tied directly into current products but could form the basis of future ones. If you had a computer science PhD and worked in an applied field such as systems or compilers, then aside from academia and government labs, there were jobs at industrial labs where researchers could work on research systems. Sun, for example, had many interesting research projects, such as Self (https://en.wikipedia.org/wiki/Self_(programming_language) ; much of the work on Self influenced the design and implementation of the Java virtual machine). AltaVista, an early Web search engine that predates Google, was originally a research project at Digital Equipment Corporation (https://en.wikipedia.org/wiki/AltaVista) that was later spun off as its own company.

    However, in the 2000s, and especially in the 2010s, these jobs became increasingly rare. Having worked in industrial research labs and advanced development teams during the mid-2010s and early 2020s, I've noticed a trend away from dedicated research labs, where researchers study phenomena and perhaps build prototypes that get passed on to a production team, and toward a model where researchers are expected to write production code. Google's 2012 paper "Google's Hybrid Approach to Research" (https://research.google/pubs/googles-hybrid-approach-to-rese...) is an excellent summary. This makes a lot of sense in the context of early Google: Google in the 2000s needed to build large-scale distributed systems to power its search engine and other operations, but there was little experience inside or outside the company with building such Web-scale systems. Thus, Google hired CS PhDs with research experience in distributed systems and related topics and put them to work implementing systems such as MapReduce, BigTable, Spanner, and many others. I see a similar mindset at AI companies such as OpenAI, where researchers work directly on production systems.

    Researchers working directly on products that take advantage of their research is an effective approach in many situations, and it has brought us many innovations, especially in Web-scale systems, big data processing, and machine learning. However, not all research has obvious, direct productization opportunities. For one, not all computer science research is systems-based. There is theoretical computer science research, where researchers explore questions that may not immediately lead to new products but may answer important questions about computing. Next, even within systems research, there are areas that could be productized a few decades down the road, but for those products to be created, the research needs to be done first. Deep neural networks took off once hardware became cheap enough to make DNN architectures feasible, for example; yet without the work done on neural networks in the decades before affordable GPUs, research on DNNs would be far behind where it is today.

    The biggest problem that I see with attitudes regarding research funding, not just in industry, but also in academia and government, is that funders don't appreciate the fact that research is inherently risky; not all research projects are going to lead to positive results, and the lack of positive results is not a matter of a researcher's work ethic or competence. Funders seem to want sure bets; they seem to only be interested in funding research that has a very high ROI likelihood.

    Yes, funders should have the freedom to fund the projects and researchers they want. There are obvious reasons why funders are more interested in hot topics such as large language models and blockchain applications than in topics with a less obvious likelihood of short-term ROI. However, I feel it is important to fund less obviously lucrative research efforts. Industry these days seems uninterested in making more speculative bets, of the kind Xerox PARC made back in the 1970s.

    Academia seems like a natural home for more speculative research. Unfortunately academia has two major pressures that undermine this: (1) the "publish-or-perish" culture found at many major research universities, and (2) fundraising pressures. These two factors, in my opinion, encourage academics, especially pre-tenure and non-tenured ones, to optimize their research pursuits for "sure bets" instead of riskier but potentially higher impact work. The fundraising pressures have gotten much worse now with the abrupt cuts to research funding in the United States.

    A long-term solution to this problem requires cultivating a culture that is more understanding of the research process, that research is inherently risky, and that different types of research require different funding mechanisms. I'm all in favor of Google- and OpenAI-style research projects where researchers are directly involved with product-building efforts, but I'm also in favor of other styles of research that are not directly tied to product-building. I also want to see a culture where large corporations and wealthy individuals donate meaningful amounts of money to fund research efforts.

    It would be a major setback for society for us to return to the pre-1940s days of "gentlemen scientists" where science and other academic pursuits were only reserved for the independently wealthy and for those who relied on patronage. Modern technological innovations are made possible through research, and it's important that research efforts are funded in a regular manner.

    replies(7): >>43959648 #>>43959838 #>>43959841 #>>43960769 #>>43961274 #>>43962814 #>>43965611 #
    2. kianN ◴[] No.43959648[source]
    This really hits the nail on the head. I think the extension of this is that the all-eggs-in-one-basket approach (betting everything on the first basket) is being exacerbated: it has always been challenging to conduct foundational research in a field where a dominant paradigm already exists in industry, but it's becoming more and more challenging in academia as well.
    3. lwo32k ◴[] No.43959838[source]
    What culture you cultivate is just one variable. Go to Japan or Iran or Italy and they will fall over each other telling you how great their culture was and what it cultivated. When things are relatively stable, it's not hard to lean on the Explore side of the Explore-Exploit tradeoff. When things are ever-changing, at faster and less predictable rates, the tradeoff naturally gets much more complex.

    You have to ride the waves. No one really controls them.

    4. robocat ◴[] No.43959841[source]
    If one altruistically decides to give to society, then I'm unsure what outcomes one should expect from society. What's the game theory here? If one wants something back, then who is responsible for getting something back?

    My canonical example is Linus: if the world were fair, he should be worth some pretty big numbers. It is harder to pick what is fair in return for someone's scientific innovation. I would guess Linus has generated as much worth as Microsoft, so in an ideal world he should be worth a Bill Gates. Linus has mostly chosen other, non-financial goals to chase (unlike Bill). Linus: $50 million; Bill Gates: $156 billion.

      “While I may not get any money from Linux, I get a huge personal satisfaction from having written something that people really enjoy using.”
    
      “The cyberspace earnings I get from Linux come in the format of having a network of people that know me and trust me, and that I can depend on in return.”
    
    Some people who complain about the wealthy making money are selfishly obsessed with money themselves. Perhaps even hypocritically denigrating others as too money-focused when those others choose to win the money-making game.

    Unfortunately, our world tends to be very focused on financial gains, often completely ignoring non-financial benefits.

    What are the non-financial benefits of an iPhone compared to the money paid for it?

    How much is job-enjoyment or job-status worth; compared against either the money earned or the time spent?

    replies(2): >>43960021 #>>43962579 #
    5. realityking ◴[] No.43960021[source]
    > I would guess Linus has generated as much worth as Microsoft

    Microsoft makes everything from game consoles to ERP systems, while Linus created one part of an operating system and a source control system. Linus certainly captured less of the economic value he created than Microsoft's founders and shareholders did, but Microsoft has generated much, much more (economic) value.

    replies(2): >>43960948 #>>43964435 #
    6. fallingknife ◴[] No.43960769[source]
    > It would be a major setback for society for us to return to the pre-1940s days of "gentlemen scientists" where science and other academic pursuits were only reserved for the independently wealthy and for those who relied on patronage. Modern technological innovations are made possible through research, and it's important that research efforts are funded in a regular manner.

    How much of a setback would this really be, though? The US government spent $200 billion on R&D in 2024. Of that $200 billion, $140 billion was military-related, which is probably not in danger. Including private spending, total R&D was around $900 billion. So even if the $60 billion of non-military government research spending were entirely eliminated, that would be only around a 7% decline in spending.
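    The back-of-the-envelope arithmetic above can be checked directly; note that the figures are this comment's round estimates, not official statistics:

```python
# Rough 2024 US R&D figures from the estimates above, in billions of USD.
government_rnd = 200
military_rnd = 140
total_rnd = 900  # including private spending

# Non-military government research spending.
non_military = government_rnd - military_rnd

# Share of total R&D that would disappear if it were entirely eliminated.
share = non_military / total_rnd
print(f"${non_military}B at risk, about {share:.1%} of total R&D")
# -> $60B at risk, about 6.7% of total R&D
```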

    But still, that would be bad. That $60 billion is a lot to replace with private funding. I don't know if it would be a net negative, though. It seems that privately funded research can seriously outperform government-funded research, because when private actors fund research, they do it for the specific reason that they want the research done. Contrast this with public funding, which has to meet a lot of political goals that have nothing to do with science. Look at what SpaceX has been able to accomplish vs. NASA. NASA has great scientists and engineers and a much bigger budget than SpaceX, but the problem is that its rocket-building program was more a jobs program and a way to spread money across congressional districts than it was a way to build rockets. SpaceX, by contrast, had exactly one goal: to build rockets.

    I also think it's important to fund research in a regular manner, but is government a more reliable source than private patrons? It feels like the opposite. If we had a government that really cared about science, committed to funding it in an effective way, and never used it as cover to funnel money to political causes, I do think it would be better than relying on private funding. But when I read a list of cancelled grants like this one, it doesn't seem like we do: https://airtable.com/appGKlSVeXniQZkFC/shrFxbl1YTqb3AyOO?jnt...

    replies(2): >>43961229 #>>43961279 #
    7. robocat ◴[] No.43960948{3}[source]
    Well, I'll pick a similar counterexample then.

    Microsoft valued Github at over $7.5 billion in stock at acquisition.

    git was developed by Linus, subsequently generating economic value for Microsoft.

    What was git worth economically? Did Microsoft pay Linus for any of the value it received?

    replies(2): >>43961199 #>>43961910 #
    8. firesteelrain ◴[] No.43961199{4}[source]
    Git is foundational. GitHub added value by building collaboration tools, UIs, CI/CD integration, and a social coding layer on top of Git. Git is licensed under the GNU General Public License (GPL v2), which allows anyone to use, modify, and distribute it freely under its terms.

    It’s an infrastructure that others build on.

    9. firesteelrain ◴[] No.43961229[source]
    SpaceX has benefited greatly from public spending - they just had a different operating model. NASA moved to the commercial space program. SpaceX has had many military and government contracts to sustain themselves. Texas offered subsidies for Boca Chica.

    SpaceX has received substantial public investment in the form of contracts, infrastructure, and incentives, so, in essence, SpaceX is benefiting from the very public funding model you seem to abhor.

    10. potato3732842 ◴[] No.43961274[source]
    >However, in the 2000s and especially in the 2010s, these jobs became increasingly rare.

    These jobs became rare because academia took them. The pie may have grown or shrunk a bit, but mostly what happened is that, instead of running this stuff in house, BigCo will sponsor or partner with some university lab or research program.

    I haven't crunched the numbers, but I suspect they get better write-offs this way.

    11. jltsiren ◴[] No.43961279[source]
    There are two fundamentally different kinds of private research funding. (Let's drop the &D part, because that's mostly unrelated to the kind of research we are talking about.)

    Charitable foundations and similar organizations are not that different from government agencies funding research. They act on a smaller scale, because rich people are not actually that rich.

    Then there are companies that do research as part of their business. They are typically much better funded and much narrower in scope than government-funded research. They are also biased towards topics that can be reasonably expected to work and produce economic value within the next 10-20 years. This kind of research is inherently inefficient due to redundant efforts. Instead of making their findings public, companies often keep the results secret, forcing their competitors to waste money on reinventing the wheel.

    12. maratc ◴[] No.43961910{4}[source]
    There is tremendous value in git, however you need to take into consideration the fact that 100% of the people and companies who pay GitHub (and thus, generate value for Microsoft) have the option of just using git without GitHub, for $0.

    This causes me to think that what these people are actually paying for is GitHub, and not git.

    Which causes me to think that, when Microsoft valued Github at over $7.5 billion, that value reflected the value of GitHub sans git.

    Linus is in the business of advancing humanity without making billions in the process. I can only salute him for that.

    13. raddan ◴[] No.43962579[source]
    > If one altruistically decides to give to society then I'm unsure what outcomes one should expect from society? What's the game-theory here?

    I’m not sure if you’re asking rhetorically, but answering your question literally: none. If you’re giving altruistically, then you should expect nothing back.

    As you point out, many projects whose goal is more personal satisfaction than money-making do indeed generate tremendous value, and in some cases, lots of money too. It’s hard to quantify the value of open source but I would not be surprised if its value exceeded the value of commercial, closed source software.

    14. mnky9800n ◴[] No.43962814[source]
    I think, to put what you are saying a different way, we have optimized the system to reward ambition at the expense of creativity and curiosity. This rewards careerist scientists over curious ones, because the careerist scientist makes all of the safe bets, exploring things that are well known or, alternatively, exploring things they think are "hot" and thus will attract funding and citations. I have been in more than one meeting where someone says something like, "uhh, how can we stick digital twins, machine learning, and this lab apparatus I have together? that will be a compelling application." There is no research question in that statement, no curiosity about the universe, only ambition to attract funding. I don't think ambition is inherently bad; it is a useful trait for a scientist. But we have optimized the system to the point where being ambitiously curious does not seem to be rewarded.

    I think what you have written here connects very well to how academia currently functions and dysfunctions. Thanks.

    replies(1): >>43968591 #
    15. Miraste ◴[] No.43964435{3}[source]
    I'm not sure about that. Linux runs effectively the entire internet, as well as the majority of personal computing devices (there are more Android phones out there than Windows PCs and iPhones combined), and everything from rockets to toasters. That's a lot of economic value.
    16. derf_ ◴[] No.43965611[source]
    I once asked someone at a large Bay-area tech company why they did not invest more in fundamental R&D, and they told me, "We just buy the winners. 'All of Silicon Valley is our research lab'" (the last being, I believe, a quote from their CEO). You don't even need to do it with your own money. Just use debt and equity.

    The economist Ha-Joon Chang likes to quote the statistic that in the 1950s, companies reinvested 65% of their profits back into growing the business, while today that number stands at 5%. The stock market overall returns more cash to shareholders through buybacks and dividends than it raises from investors by selling equity. It is no longer primarily a funding mechanism for companies; it's how investors cash out.

    I don't even think it is about making Ph.D's write production code: I support that, as it makes them much more realistic about what is technically feasible (I say this as a Ph.D who has written a lot of production code, and has had to work with Ph.D's who haven't). It is that whatever you are doing must be able to be tied directly to near-term revenue. Even cost savings are rarely attributed back to the people who created them, because no one wants to give you credit for money they didn't spend. And certainly no one wants to fund something that will not generate a tangible return for 5+ years, or that might not succeed at all.

    At one point I was asked, as a research Ph.D focused on very low-level software, to come up with subscription-revenue generating product ideas, because "subscription revenue" was the latest buzzword among upper management. They wanted me to build the next TikTok for them. And all I could think was, even if I had the product design talents to be able to do that, in a ZIRP era, why would I build it for you? We are not in that era any longer, but it lasted a long time, and its passing has only made people more near-term focused.

    17. tough ◴[] No.43968591[source]
    Sadly, the ones getting the funding are the ones paying the salaries and handing out the job roles, so they feel inherently more legitimized in churning the revenue into the money-making machine.

    Never be a cost center