Wednesday 19 September 2018

How The West was destroyed in four steps

1. Science proves its power and usefulness. 17th to early 19th century.

2. Science establishes itself as the Only valid form of knowledge, and 'hence' the only valid mode of reasoning. Late 19th-early 20th century.

3. Bureaucracy subsidises, infiltrates, occupies, subverts, destroys science. Middle to late 20th century.

4. Bureaucracy established as the only valid form of organisation, knowledge and thinking - global totalitarian system in place.

Corresponds to: 1. Voluntary collaboration of individuals, 2. Formal institutional structure of individuals, 3. Control by committees/peer review, 4. Individuals serve abstract processes and procedures.

Corresponds to: Scientists as Christians, Scientists brought up as Christians, Scientists as atheists, 'Scientists' as careerists.


12 comments:

William Wildblood said...

Playing devil's advocate for a moment, Bruce (because I agree with your scenario), what would you say to someone who pointed to the advances in computer and medical technology over the last few decades as examples of how science is still thriving?

lgude said...

I am not a scientist, but I was on a Faculty of Education in the late 80s when I saw that step 3 had happened in my own institution. I found myself thinking as I walked across campus: 'At least we are not corrupt like a policeman who takes money not to do his job.' A few steps later the words came: 'Oh yes we are.' In retrospect I realise that I was contemplating the 'bureaucratic infiltration, subversion and occupation' of academic independence and integrity. In six months I had quit. Oh, I have eagerly sold my soul as often as the next man, but when I can see that what is happening is outright betrayal, even I lose my appetite. Since those days, the outlines of the 'global totalitarian system' have become clear enough that even a retired academic like me can see them.

Bruce Charlton said...

@William - There have been advances in technology in a very few areas - at the cost of hugely increased expense and inefficiency. But our overall capability peaked and then declined. And the incidence of major scientific breakthroughs (rather than incrementally working out the implications of decades-old breakthroughs) has declined to near zero.

This decline is, however, hidden by the truly epic expansion in hype, spin, advertisement and sheer lies; and indeed the people now in charge of so-called science lack the ability, as well as the honesty, to make a valid evaluation of their fields.

Chiu ChunLing said...

The advances in computer technology (on which nearly all apparent advances elsewhere are based) are merely a matter of improved engineering in the inherently iterative process of creating more precise tools for making integrated circuits smaller, cheaper, and faster. This leads to better instruments for gathering/analyzing basic data in many other fields.

While there have been some actual scientific breakthroughs, they are getting relatively rarer; what we now celebrate are mainly engineering benchmarks, especially in data collection and processing.

That, and sheer handwavium put forth as advanced theoretical science that has no possible technological application and is thus unverifiable.

Theramster said...

Totally agree with the post and comments. This has always been my feeling, intuition and finding. There is a qualitative difference between the quantitative, incremental working-out of an initial discovery and a new discovery. The sheer quantity can give the impression of advancement, but it is not advancement, even if it makes the initial discovery more available quantitatively and hence usable by more people. In a sense it is replication of the same species rather than the introduction of a more advanced type.

The sad reality is that any such new type won't even be esteemed in the current environment, but will be swamped by, and because of, this well-established overpopulation of the old.

This points in the direction that the real solution will never arise from within the contours of the now-existing institutions.

They are by definition self-satisfied anti-solutions, or idolized old solutions.

It may require a real Christian miracle because it can't arise and succeed spontaneously in such a hostile environment.

Bruce Charlton said...

@lgude

Thanks for that.

William Wildblood said...

Thanks Bruce. That makes perfect sense.

Seijio Arakawa said...

Advances in computer technology have been essentially stillborn. The core conceptual work (the part that requires genius) was largely done by the 1970s. This includes both the mathematical foundations and the conceptual/'UI' ideas about how computers could be used by nonspecialist humans (see Douglas Engelbart's *Mother of All Demos*, 1968). There have been many microbreakthroughs since then, but they have been progressively less significant, progressively less beneficial, and progressively more hyped.

E.g. 'machine learning' can do interesting things, but the immense hype prevents people from understanding what it can't do. The inventor of the neural network was recently on record stating that he thinks the neural network is a dead end that won't work for what people are trying to use it for. The only way forward would be some radically new approach discovered by some 'hungry graduate student' who is able to question every assumption made by this field and look at things in some completely new way... i.e. a genius!

... good luck finding one.

In terms of the quality of core software engineering, thinking men were already well aware that something was going terribly wrong around 1970 (see e.g. references to 'software crisis' in Dijkstra's "The Humble Programmer", 1972). Dijkstra in particular also pointed out that programming interfaces were terribly designed: while mathematical definitions are conceptually simple, software interfaces include gratuitous 'gotchas' that needlessly complicate code and make it impossible to write a program and be certain that it will always behave as intended.
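
(A concrete, if minor, illustration of the kind of 'gotcha' meant here, using C's strncpy as my own example rather than one Dijkstra names: it looks like a 'safe' bounded string copy, but when the source does not fit it silently leaves the destination without a terminating '\0', so the caller must remember an extra step that the obvious contract would never require.)

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char dest[8];
        const char *src = "a string longer than eight bytes";

        /* Copies at most 8 bytes; since src does not fit, NO terminating '\0' is written. */
        strncpy(dest, src, sizeof dest);

        /* The caller must terminate by hand, or any later read of dest overruns the buffer. */
        dest[sizeof dest - 1] = '\0';

        printf("%s\n", dest);   /* prints "a strin" */
        return 0;
    }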

Unlike other disciplines of engineering, programmers developed a culture that was neither willing nor able to make the effort to avoid software errors and to keep things as simple as possible. Partly, the problem was that programming is an individual craft (like carpentry or toymaking) and ought to be understood as such, but its rise coincided precisely with the "bureaucratic infiltration" period of science. Carpentry by committee! Dijkstra is understood mostly as a weird, uptight kook with perfectionist demands that are incompatible with software development in the modern, ultra-fast, Internet-connected 'Real World'.

Since software mirrors the organization of the people who wrote it, and software development is now managed bureaucratically, software development is done by piling on complexity. Hence advances in hardware capacity are eaten up by software complexity. This blog software does the same job of editing textual messages and sending them over the Internet as Usenet did in the 1970s. Yet it requires the 1970s equivalent of a major supercomputer to run.

So the 'advancement' of software development consists more than half in replacing earlier and simpler software with newer and more complicated software... that does the exact same thing!

So compared to what 'ought' to have been accomplished with computers, the current state of affairs is distinctly underwhelming.

Bruce Charlton said...

@Seijio - Fascinating stuff.

Chiu ChunLing said...

Well, the sad truth is that the great majority of humans simply cannot learn to program elegantly, so for there to be enough working programmers to produce software adapted to the various needs of diverse users, you cannot avoid establishing dominant paradigms that prefer methodical application of packaged solutions to craft.

Craftsmen still exist, but mass-production is the order of the day...though the analogy breaks down a bit in the middle because of the essential differences between IP and physical goods.

But yes, bloatware is consuming most of the gains made by engineering advances in available computing power.

Seijio Arakawa said...

An interesting counterexample to the current philosophy of software churn is TeX, which is often used for academic typesetting, and has been completely unchanged for the last 4 years:

“The current version of TeX is 3.14159265; it was last updated 2014-01-12.[10] The design was frozen after version 3.0, and no new feature or fundamental change will be added, so all newer versions will contain only bug fixes. Even though Donald Knuth himself has suggested a few areas in which TeX could have been improved, he indicated that he firmly believes that having an unchanged system that will produce the same output now and in the future is more important than introducing new features. For this reason, he has stated that the "absolutely final change (to be made after my death)" will be to change the version number to π, at which point all remaining bugs will become features.[11] Likewise, versions of METAFONT after 2.0 asymptotically approach e, and a similar change will be applied after Knuth's death.”

https://infogalactic.com/info/TeX

That is, once the basic functionality (figuring out where text goes on a page) has been developed, and proven itself in practical use (it works well, and in the extremely rare cases it doesn’t, everyone knows how to work around it) there is *absolutely* no reason to change things further, let alone replace it with a new version of the software that does the same thing while eating more resources.

Compare this to the Internet: the standards for typesetting technology are constantly being changed (by a huge unaccountable committee, of course) and web browsers are continually rewritten. As they’re rewritten with the latest technologies and practices they get ever larger and more resource-hungry. You cannot browse the 2018 Internet with a 2005 browser, and you cannot fit a 2018 browser on a 2005 computer.

Chiu ChunLing said...

Even a few years' difference, let alone a decade, in hardware performance makes a noticeable difference in browsing the more 'feature'-laden sites these days.