flatpepsi17 13 hours ago

Article starts mentioning 4GL's - a term I have not heard in a long, long time.

COBOL's promise was that it was human-like text, so we wouldn't need programmers anymore. A lot like "low code" platforms, and now LLM generated code.

The problem is that the average person doesn't know how to explain & solve a problem in sufficient detail to get a working solution. When you get down to breaking down that problem... you become a programmer.

The main lesson of COBOL is that it isn't the computer interface/language that necessitates a programmer.

  • vishnugupta 3 hours ago

    I agree with you by and large except for this part.

    > COBOL's promise was ... we wouldn't need programmers anymore..average person doesn't know how to explain & solve a problem

    COBOL wasn't intended to be used by an "average" person, but rather by those with deep domain knowledge. They would know the business processes so well that they could transcribe them in COBOL with little or no need to learn how computers worked. In some ways similar to analysts/data folks using SQL to communicate with databases.

    While I'm at it, let me share a few more aspects off the top of my head.

    COBOL and 4GLs in general were primarily intended for building business applications: payroll, banking, HRMS, inventory management, and so on. Even within that, the emphasis was on batch processing, to reduce the burden on people doing routine bulk operations like reconciliation.

    COBOL harks back to a time when there was no dedicated DBMS software, which is why you see so much focus on how files are organised, along with an extensive set of file-handling verbs that somewhat resemble SQL today.
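
    To make the file-verb point concrete: COBOL's indexed files gave you keyed record lookups at the language level (READ ... KEY IS), long before SQL. A rough sketch of the idea in Python, with invented field names and record layout:

```python
# Hypothetical sketch: emulating COBOL-style indexed-file access
# over fixed-width records. Field names and layout are invented.
records = [
    "0001SMITH     0005000",
    "0002JONES     0012000",
]

def parse(rec):
    # Columns: acct-id (4), name (10), balance in cents (7),
    # the way a COBOL record layout would fix them.
    return {
        "acct_id": rec[0:4],
        "name": rec[4:14].rstrip(),
        "balance_cents": int(rec[14:21]),
    }

# The "index" a COBOL runtime would maintain for
#     READ ACCOUNT-FILE KEY IS ACCT-ID
index = {row["acct_id"]: row for row in map(parse, records)}

print(index["0002"]["name"])  # JONES
```

    (Purely illustrative; a real COBOL shop would declare the layout in the DATA DIVISION and let the runtime maintain the index.)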

    • moomin 2 hours ago

      In my experience, it's often hard to find that person with deep domain knowledge, and even when you do, their knowledge is unstructured, they take things for granted that they shouldn't,* and they have no appreciation of the demands of formalism.

      Getting anything you can use to construct a work plan, never mind a detailed feature list, out of clients can be a dark art.

      *To the point that I have repeatedly experienced, close to the end of a project, the moment where they go "What do you mean you don't handle a case I have failed to mention for the entire duration of the project?"

      • dcminter an hour ago

        I recall a spec doc from a domain expert that said something like:

        "The transaction consists of a credit stub and a debit stub. If the debit stub is missing and is of type X then we do A and if it is of type Y then we do B."

        How to know what flavour the missing item was? Absolutely no mention of that...

    • tannhaeuser 2 hours ago

      > COBOL and 4GLs in general

      COBOL dates back to 1959, much earlier than the 4GLs, and the cited 1992/1999 articles make the point that 4GLs were poised to replace the likes of COBOL and FORTRAN. In fact those dinosaurs (or rather nautili, since they're still living) turned out to outlive the 4GLs, with the exception of SQL (when counted as a 4GL).

  • froh 4 hours ago

    > The problem is that the average person doesn't know how to explain & solve a problem in sufficient detail to get a working solution.

    I intuit this is also an intrinsic limit to LLM-based approaches to "you don't need them expensive programmers no more".

    With LLMs magically "generating the solution", you move the responsibility for concise expression of the problem up the ladder.

    And then you "program" in prompts, reviewing the LLM-proposed formalization ("code").

    In other words, the nature of "programming" changes to prompt engineering. Alas, you still have to understand formal languages (code)...

    So there'll always be plenty to do for humans who can "math" :-)

    • devjab 2 hours ago

      This is only true to an extent. We have a lot of digitally inclined workers who're developing programs or scripts to handle a lot of things for them. It's imperfect, and often wildly insecure and inefficient, but unlike any previous no-code or "standard" solution it actually works. Often in conjunction with "standard" solutions.

      On one hand you're correct in that there will always be a need for programmers. I really doubt there will be a great need for generalist programmers, though. The one area that may survive is the people who're capable of transforming business needs and rules into code, which requires a social and analytical skillset for cooperating with non-tech people. You'll also see demand for skilled programmers at scale and for embedded programming, but the giant workforce of generalist developers (and probably web developers, once Figma and similar tools let designers generate better code) is likely going to become much smaller in the coming decades.

      This is basically what the entire office workforce is facing. AI believers have been saying for years that AI would do to the office what robots did to the assembly line, but now it actually seems like they're going to be correct.

      • stroupwaffle 32 minutes ago

        Another parallel is type foundries and printing presses. At one point people operated Linotype machines, which used molten lead. This eventually transitioned to phototypesetting which, to the dismay of everyone, had poor results. Along came Donald Knuth and TeX to fix those deficiencies. (Note: mechanical printing has a profoundly better result no matter what; it is the ink and the impression in the paper that make it superior, for letterforms and such.)

        So, if AI follows suit, we will witness the dumb (but very knowledgeable) AI start to supplant workers with questionable results; and then someone (or a team) will make a discovery to take it to the limit and it’ll be game over for large swaths of jobs.

        https://en.m.wikipedia.org/wiki/Hot_metal_typesetting

    • jasfi 4 hours ago

      A lot of business people want to get something functional that they can sell, and hire a programmer if/when they can afford one. That niche is seeing a lot of uptake with regards to LLM based approaches.

      This works for them because an MVP typically isn't a lot of code for what they need, and LLMs have a limited scope within which they can generate something that works.

  • unscaled 6 hours ago

    4GLs were supposed to be even more of that, with more "human-language-like" constructs added to the language to deal with things besides general logic, simple data structures and arithmetic.

    The author mentions "4GLs" were all the rage in the early 1990s, but I doubt that was true outside of the mainframe world. The 4GL movement, as a conscious movement, seems to have always been highly mainframe-oriented (the Wikipedia article mentions reducing the number of punched cards necessary for a program as an initial goal). By the 1990s you could categorize many languages as 4GLs, but I doubt this term was used with any enthusiasm outside of the mainframe world. It was the opposite of a buzzword.

    1992 wasn't that long ago. Linus Torvalds had already released Linux, and Guido van Rossum was already working on Python. Perl was gaining popularity, and Haskell had seen its first versions released. The forefront of technology was already shifting from expensive workstations to consumer-grade PCs, and language designers gave little thought to 4GL concepts, even when they happened to design something that could qualify as a 4GL for personal computers (e.g. dBase, HyperTalk, AppleScript).

    I agree that human-like text is a bad idea for most use cases of programming, but I think this is not why the 4GL movement failed; in fact most 4GLs weren't more "natural-language-like" than the 3GL COBOL. I think the main problem was that the 4GL movement never really defined a new generation, or anything useful at all. The previously defined generations introduced revolutionary changes: translation from friendlier assembly language to machine code (2GL) and compilation (3GL). The only change we can properly extract from the loose definition of 4GL is "put more features that used to be external routines or libraries directly into the language".

    This approach worked out relatively well when the language was domain-specific. That is how we got some of the most successful 4GLs, like SQL, R and MATLAB. These languages bake data tables, statistics and linear algebra directly into their syntax and became very useful in their own niches. The concept of a general-purpose 4GL, on the other hand, was always destined to boil down to an overly bloated language.
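
    The SQL case shows the appeal in miniature: the language operates on whole tables declaratively, where a 3GL makes you fetch rows and loop by hand. A small illustration using Python's built-in sqlite3 module (table and numbers invented):

```python
import sqlite3

# Illustration only: table name and data are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE payroll (name TEXT, dept TEXT, salary INTEGER)")
con.executemany(
    "INSERT INTO payroll VALUES (?, ?, ?)",
    [("ada", "eng", 120), ("bob", "eng", 100), ("cyd", "hr", 90)],
)

# The 4GL way: one declarative statement over the whole table.
total = con.execute(
    "SELECT SUM(salary) FROM payroll WHERE dept = 'eng'"
).fetchone()[0]

# The 3GL way: fetch the rows and loop by hand.
total_by_loop = 0
for _name, dept, salary in con.execute("SELECT * FROM payroll"):
    if dept == "eng":
        total_by_loop += salary

print(total, total_by_loop)  # 220 220
```

    Both compute the same number, but the SQL statement says *what*, not *how*, which is the whole pitch of a domain-specific 4GL.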

    • int_19h 2 hours ago

      dBase and its numerous descendants and competitors (FoxPro, Clipper etc) were extremely popular for line-of-business desktop applications in the 90s. And, yes, they are indeed traditionally categorized as 4GLs - and, given how nebulous the definition always has been anyway, I think that "traditionally categorized" is the most practical definition that you can use here.

      But, yes, I agree that aside from the generally more verbose and sometimes unwieldy syntax, there wasn't really that much to it in practice. I did work with FoxPro, and the reason why it was popular was not because you had to write things like "ACTIVATE WINDOW", but because it had many things baked directly into the language that nicely covered all the common tasks a pre-SQL data-centric app would need - e.g. a loop that could iterate directly over a table.
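
      For what it's worth, FoxPro's SCAN ... ENDSCAN iterated the open table directly; the rough general-purpose equivalent needs an explicit reader and loop. A hypothetical sketch in Python (data invented, FoxPro shown only in comments):

```python
import csv
import io

# Invented sample data; in FoxPro this would be a .dbf table
# opened in the current work area.
table = io.StringIO("id,city\n1,Lisbon\n2,Porto\n")

# FoxPro let you write, against the open table:
#     SCAN FOR city = "Porto"
#         ? id
#     ENDSCAN
# A general-purpose language makes the iteration explicit:
matches = [row for row in csv.DictReader(table) if row["city"] == "Porto"]

for row in matches:
    print(row["id"])  # 2
```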

  • AdieuToLogic 11 hours ago

    > COBOL's promise was that it was human-like text, so we wouldn't need programmers anymore. A lot like "low code" platforms, and now LLM generated code.

    The more things change, the more they are the same.

  • bloppe 6 hours ago

    Even LLMs have not realized the dream of a natural language computer interface. Everyone who uses them significantly has to read up on prompt engineering and add little things like "explain your steps" or "describe it like I'm 5" or other oddly specific sequences of characters to get the results they want. That's not natural language. It's a DSL.

    • rbanffy 3 hours ago

      Worse. It’s a DSL without a formal specification. You are writing prompts blindly in hopes they trigger the desired behaviour from the LLM.

      A bit like writing enchantments to force demons to do your bidding.

  • UniverseHacker 13 hours ago

    > we wouldn't need programmers anymore

    This blows my mind, since it seems like a fairly low level/terse language compared to more modern domain specific languages.

    But in some sense they were dead right... since (I assume) what "programming" meant at the time was being able to write raw machine code by hand on paper and have it work, something few people can or need to do nowadays.

    • thorin 19 minutes ago

      Saying we don't need "programmers" any more was true when a programmer was someone who used very low-level languages such as assembly and had probably used punched cards in the past. Languages like COBOL/Fortran/PL/SQL gave analysts a chance to design things on paper and hand them off to developers, or even do the development themselves, which couldn't have happened before. Using something like Python these days feels like the kind of thing that would have been thought of as a 4GL back then, for some use cases. However, Python also works as a general-purpose programming language.

    • AdieuToLogic 11 hours ago

      > This blows my mind, since it seems like a fairly low level/terse language compared to more modern domain specific languages.

      I have heard others and myself describe COBOL in many ways, most involving creative expletive phraseology which would make a sailor blush, but "low level/terse language" is a new one to me.

      > But in some sense they were dead right... since (I assume) that what "programming" meant at the time was being able to write raw machine code by hand on paper ...

      LISP and Fortran predate COBOL IIRC.

      • andsoitis 3 hours ago

        > LISP and Fortran predate COBOL IIRC.

        Correct. Fortran, LISP, and COBOL were invented in ‘57, ‘58, and ‘59, respectively.

        • jll29 2 hours ago

          > Yes, but the ideas behind COBOL were older still. Flowmatic, COBOL’s predecessor, dates back to 1955, so it really just depends how you count.

          Yes, but the ideas behind LISP were older still: Church's lambda calculus was conceived in the 1930s.

        • moomin 2 hours ago

          Yes, but the ideas behind COBOL were older still. Flowmatic, COBOL’s predecessor, dates back to 1955, so it really just depends how you count.

    • electroly 10 hours ago

      Do you mean something other than "terse" here? Or are you perhaps thinking of a different language? I cannot possibly imagine that adjective being used to describe COBOL. It is the #1 textbook example of a verbose language--the opposite of terse.

      • UniverseHacker 10 hours ago

        What I mean is that it is an attempt to make a high-level domain-specific language, but it is not by modern standards.

  • wvenable 7 hours ago

    My company is finally upgrading away from a product that is written in a 4GL language. This product probably started out on a Unix but was ported to Windows decades ago. It has both a web and classic VB front ends.

    All the source code is available and theoretically I could make changes and compile it. The language itself is basically just plain procedural code but with SQL mixed right in, somewhat like dBase or FoxPro but worse. I think the compiler produces C code which is then compiled with a C compiler, but it's been a while since I looked into it. It requires a version of KornShell for Windows as well.

  • actionfromafar 12 hours ago

    Vision 4GL. Like VB, but cross-platform and with a horribly unstable IDE which would corrupt the source code. (Which was in some kind of binary format not amenable to source control.)

norir 14 hours ago

I think of scala in this context. I think scala is basically dead at this point, in the way that COBOL was framed in the article. Yes, there are still many businesses/services that have critical components written in scala, but current mindshare has cratered for new projects. I only single out scala because I have spent a lot of time with it and have seen it go through the hype cycle (in 2012-14 it seemed like I was constantly seeing "doing $X in scala" pieces on HN, and I almost never see it referenced here anymore). It's probably a natural and inevitable phenomenon (and a bit of a shame, because scala did get some things right that other mainstream languages still have not).

  • guessmyname 11 hours ago

    I know a couple of engineering teams at Apple that are working on new projects in Scala, while also maintaining some legacy systems. Some of these projects are quite critical to the company’s ecosystem, e.g. test systems. I’ve spoken with several engineers who helped create these systems years ago; they’re all now in senior management positions. Some still stand by the technology choices made back then, while others are more open when given a chance to reflect. The general consensus is that if Kotlin had been available at the time, or if Swift had been a viable option for back-end services, they definitely wouldn’t have chosen Scala for those projects.

    • andsoitis 3 hours ago

      > The general consensus is that if Kotlin had been available at the time, or if Swift had been a viable option for back-end services, they definitely wouldn’t have chosen Scala for those projects.

      But they were not.

    • emmelaich 9 hours ago

      Surprised they don't use Swift. Or is that too unstable? Or is there an on-JVM requirement?

      • worthless-trash 7 hours ago

        My money is that they started these projects before Swift was available on Linux.

        I have no evidence that Apple uses Linux, but businesses gotta business, so it isn't a big bet to make.

        • swiftcoder 2 hours ago

          Even if the servers run macOS, Swift wasn't really being aimed at backend use cases for the first few years of its existence...

        • mozman 7 hours ago

          Apple is BSD based - not Linux.

          • scottlamb 7 hours ago

            Apple is a company, not an operating system. The parent is almost certainly aware macOS is BSD-based and is suggesting Apple also uses Linux in e.g. cloud deployments. They are of course correct.

  • bad_user 8 hours ago

    Scala is very much alive and kicking.

    https://redmonk.com/sogrady/2024/09/12/language-rankings-6-2...

    The initial hype has died off, and that's OK. The hype cycle is inevitable for all languages. Also, predictions rarely happen, mostly because the landscape has changed: mainstream programming languages can no longer die like COBOL did.

    E.g., Java has been dying ever since 2001, surviving the dotcom bubble, .NET, the P in LAMP, Ruby, JS, or Go. Python was supposed to die on its version 3 migration, with people supposedly moving to Ruby.

    FWIW, Scala is the world's most popular FP language, it has good tooling, and libraries, and Scala 3 is a wonderful upgrade.

    • norir 5 hours ago

      I spent 3 years working on scala tooling in my free time. One of my libraries is used by the vast majority of scala users (it is a dependency of other widely used tools). There was growth from 2018-2023 but it has flatlined over the last year. Right when theoretically it should be getting the boost from scala 3 having now been stable for a bit.

      Personally I feel that scala has too much in the language and the compiler is too slow. The tooling is pretty good but it is finicky and ends up getting slow and/or unreliable with larger projects. Even if I were to restrict myself to a small subset of scala, I would still be unsatisfied with the long compile times which was the primary reason I decided to move on.

      I don't know if I agree with your contention that languages can't die like COBOL. I think you can relatively easily keep a legacy scala system up, put it in maintenance mode and write new features/products in something else. That is what I expect is already happening with scala and that this trend is likely to accelerate. Keep in mind also that Martin Odersky is nearing retirement age and it's really hard to imagine scala without him. He has much more power/control than the head of most languages.

      • bad_user 4 hours ago

        IMO, there's rarely such a thing as maintenance mode. Projects constantly evolve, and in turn this drives more investment in tooling and the ecosystem needed to keep those projects up. And this investment is what eventually drives more new projects and fresh blood, keeping the language fresh and in demand.

        Again, look at Java.

        Ofc, there's always the question of what happens with a market that isn't constantly growing due to the zero-interest-rate phenomenon. I guess we'll see, but IMO, that's problematic for newer languages, not established ones.

        I too am a contributor to very popular libraries and am very familiar with the ecosystem. One thing to keep in mind is that the language's culture has evolved. When I picked up Scala, back in 2010, the Future pattern and Future-driven libraries were all the rage, whereas nowadays people prefer alternatives, which now include blocking I/O (Loom), with Future-driven libs being a risk going forward.

        • int_19h 2 hours ago

          I don't think many people would describe Java as "fresh" these days. In demand, sure, but this is overwhelmingly driven by existing large enterprise codebases. Also, for all the talk about nifty new features, how much stuff is still on v11 even?

          • baud147258 33 minutes ago

            > how much stuff is still on v11 even?

            We've had a potential client ask for a PoC in Java 8, to integrate with their current system... But yeah, our product is deployed with Java 11 and since some dependencies have issues with 18, we'll likely stay that way for a few more years

          • bad_user an hour ago

            I understand where your reply is coming from, but again, I have been reading the same opinions about Java for more than two decades.

            > overwhelmingly driven by existing large enterprise codebases

            That happens with all mainstream languages, but it's a feedback cycle. The more popular a language is (in large enterprise codebases), the more it will get used in new projects, for obvious reasons. People want to get shit done and to have good ROI and maintenance costs. Therefore, the availability of documentation, tooling, libraries, and developers helps, in large and small projects alike.

            And yes, Java is quite fresh, IMO.

          • surgical_fire an hour ago

            I've been a Java developer for nearly two decades, at multiple companies, despite being proficient with other languages. Java just happened to pay better.

            Nearly all of the companies I worked for were developing new systems, tools, etc. Rarely was I doing maintenance on "existing large enterprise systems".

  • bigger_cheese 9 hours ago

    I think Perl today is probably closer to COBOL: it was massive for a time, felt like it was everywhere.

    Nowadays it is increasingly niche. Like COBOL, there is still a lot of Perl code out in the wild.

    • bigiain 5 hours ago

      Perl footgunned itself with the Perl5/Perl6/Raku debacle and almost two decades between major releases.

      I wrote a _lot_ of Perl, starting with Perl4 cgi scripts in the mid 90s, then Perl5 and FastCGI and Apache ModPerl. I loved it as a language. But by the time I left that gig in 2008, nobody wanted Perl any more. I mostly drifted around PHP, Python, Ruby, and Javascript for a few years until moving away from full time coding and up (sideways?) into leadership and mentoring roles.

      Interestingly, I got _into_ the Perl gig when I bailed on a COBOL maintenance gig where it was clear nobody was at all interested in listening to how their 10+ year old custom COBOL warehouse management app (written by the company the boss's sister used to own), running on EOLed Wang minicomputers, was completely incapable of dealing with 4-digit dates for Y2K. I jumped ship to make that somebody else's problem.

    • enriquto 5 hours ago

      > Nowadays it is increasingly niche.

      Still, if you buy a brand new Mac today, most of the executable scripts in the system are written in Perl.

      You can check it yourself by running:

          file -bL /bin/* /usr/bin/* | cut -d' ' -f1 | sort | uniq -c | sort -n
      
      As of 2024, macOS is essentially a Perl operation.

      • rightbyte 40 minutes ago

        The bad thing about using a proper language like Perl for admin scripts is that they will degenerate into programs.

        The good thing about Bash etc. is that they are so bad you won't, and when you do it anyway, at least you get some whip lashes for it.

    • xarope 7 hours ago

      Time to brush up my Perl. Requires some zen-ness and flow time to grok the #@[]{} again...

      • kevindamm 6 hours ago

        It's been over a decade for me, but I remember the % for hashes and @ for arrays not being that hard to decipher; it was the "where is this $_ referring to at this point?" kind of puzzle that would stump me, especially when pattern matching implicitly uses it too.

  • darksaints 9 hours ago

    It's a shame too. Scala 3 is actually an amazing language, and has the best type system of all the common functional languages. Part of me wonders whether Scala would still have died off if Scala 3 had come out first.

  • mcv 11 hours ago

    Scala can never be dead like COBOL, because it has never been alive like COBOL was. I love it too, but Scala has always been fringe. COBOL was everywhere.

  • alfalfasprout 10 hours ago

    What about Spark? Given the incredible adoption of Spark across the industry, I don't see Scala going away anytime soon.

    • SOLAR_FIELDS 7 hours ago

      Probably PySpark and similar connectors are robust enough now that Spark and Scala are not necessarily joined at the hip like they were 10 years ago. If you were working in Spark at the peak of its hype cycle back then, you basically had to use Scala to at least some extent (even if it was simply a core team exposing native APIs in other languages), since it was the most native approach and exposed all the APIs you needed. Nowadays other languages and wrappers have probably caught up enough that using Scala is not the absolute requirement it was before.

      • tdeck 5 hours ago

        This is very true in my experience. I worked in Spark for 3 years and never touched Scala code. I imagine there are many people using Spark who don't even know it's written in Scala, or whose only interaction with Scala is accidentally stumbling on Scala Spark documentation when they meant to Google for PySpark.

  • 7thaccount 12 hours ago

    I assume it became less popular when Java became more bearable.

    • n_plus_1_acc 12 hours ago

      And Kotlin came around with great IDE support and good features, without the complexity of Scala.

  • Lance_ET_Compte 9 hours ago

    Scala is the basis for Chisel HDL, which is widely used in the RISC-V design community.

  • jackcviers3 9 hours ago

    We use it for all new services at Writer. Jack Henry, SiriusXM, Starbucks, Disney streaming services, and Capital One all have services (not data-science) divisions producing new projects in Scala, ranging from the last five years to today.

    There are many others, of course, but those are the teams at places people have heard of, off the top of my head. It's far from dead.

    What does seem to be dying are the framework-centric Play and Akka jobs, and the non-Airflow raw Spark jobs out there. Now, a lot of that is because they were framework jobs that happened to originate in the Scala ecosystem; Scala was largely incidental, chosen because of founding project members' preferences or the need to develop a commercial market, imho.

    • BirAdam 8 hours ago

      That’s precisely why people think it died. It became stable and therefore useful. It is therefore not talked about every 3 seconds by startup founders.

    • mozman 7 hours ago

      As a fellow vendor for one of the names you dropped, I recommend you not name any companies.

      • wholinator2 6 hours ago

        Why? Googling every name given returns public job postings for (senior) scala engineers. Presumably scala divisions at these companies are public knowledge?

  • mhh__ 9 hours ago

    Scala 3 looks fairly interesting.

    The problem, however, is that I can't be bothered to roll out a JDK, and secondly, if I did, it might encourage someone else to start writing Java again internally. Risky payoff...

tombert 14 hours ago

You know, one of these days I really need to sit down and play with some of these "legacy" languages, like Fortran or COBOL or Ada or APL; languages that have certainly fallen out of popularity but are still used in some critical places.

It does make me wonder about millions and millions of lines of Java out there; Java has more or less eaten the enterprise space (for better or worse), but is there any reason to think that in 30-40 years the only people writing Java will be retirees maintaining old banking systems?

  • fastneutron 18 minutes ago

    Fortran is alive and well in science and engineering. The more modern standards are much nicer to work with, but largely backwards compatible with stuff written 50 years ago.

  • Muromec 13 hours ago

    COBOL is still there not because of COBOL itself, but because of vendor and platform lock-in. And, I guess, having a monolithic codebase/platform.

    It's not even esoteric or difficult; there's just a lot of it, without much structure visible to you.

    • danielmarkbruce 13 hours ago

      This is what people miss about COBOL. It's not like people are compiling COBOL and running it on Linux on an x86 box. They are running it on legacy operating systems (and hardware) which provide a different set of underlying services. It's a whole different planet.

      • crackez 12 hours ago

        Negativo friendo.

        The mainframe is turning into a middleware layer running on Enterprise Linux. We've containerized the mainframe at this point, and I mean that directly: e.g. running JCL and multiple CICS regions, all in COBOL that originated on z/OS, now running in k8s on amd64.

        • kjellsbells 10 hours ago

          I hope you're right, but many comments here on HN suggest their experience with mainframes is very different. z/OS and its predecessors provided so many services completely transparently to the application that a mainframe to modernity migration is doomed to fail unless it can completely emulate (or design around) the capabilities provided by the OS and other subsystems.

          Even ignoring the needs of the super-high-end customers like banks (e.g., CPUs in lockstep for redundancy), being able to write your app and just know that inter-node message passing is guaranteed, storage I/O calls are guaranteed, and failover and transaction processing are guaranteed, just raises the bar for any contender.

          K8s is wonderful. Can it make all the above happen? Well, yes, given effort. If I'm the CTO of an airline, do I want to shell out money to make it happen, risk it blowing up in my face, or should I just pay IBM to keep the lights on, kick the can down the road, and divert precious capital to something with a more obvious ROI? I think their "no disasters on my watch/self preservation" instinct kicks in, and I can't really blame them.

          HN thread:

          https://news.ycombinator.com/item?id=36846195

          • Spooky23 7 hours ago

            Like anything else, some places are awesome, some not. I’ve seen both. The worst ones are just like modern places with overcustomized PeopleSoft or SAP - except the blobs of off the shelf software were purchased 30 years ago by people long dead.

            Other places stopped development 20 years ago and surrounded the mainframe with now legacy middleware. A lot of the “COBOL” problems with unemployment systems during COVID were actually legacy Java crap from the early 2000s that sat between the mainframe and users.

          • Muromec 3 hours ago

            >If I'm the CTO of an airline, do I want to shell out money to make it happen, risk it blowing up in my face, or should I just pay IBM to keep the lights on

            But that's the thing, we are at the point when "keep paying IBM" isn't the acceptable answer anymore.

        • accra4rx 8 hours ago

          [I work as an SA.] There are many companies that don't have the original COBOL source code, only compiled objects that have been running for more than a few decades. How can you guarantee that those will run perfectly in k8s? Major companies can never take that risk unless you give them some insurance against failure.

        • Muromec 11 hours ago

          There is a major drawback to this approach -- you need to have somebody who knows what they are doing. Total deal breaker in most of the places that have this problem in the first place.

          • gerdesj 11 hours ago

            "you need to have somebody who knows what they are doing"

            That applies everywhere.

            Your parent comment has managed to stuff a mainframe in a container, and suddenly hardware is no longer an issue. COBOL is well documented too, so all good, and so too will be the OS they are emulating. I used to look after a System/36 and I remember a creaking book shelf.

            The code base may have some issues, but it will be well battle-tested due to its age. It's COBOL, so it is legible and understandable, even by the cool kids.

            If you lack the skills to engage with something then, yes, there will be snags. If you are prepared to read specs and manuals, and have some reasonable programming aptitude and so on, then you will be golden. No need for geniuses, just conscientious hard workers.

            It's not rocket science.

            • Muromec 3 hours ago

              That's not the point I'm trying to make. Yes, you can do fancy stuff like that, and de-mainframing COBOL to run it on k8s is the path I would personally choose if I had to deal with it. It sounds like a lot of fun, and the sense of accomplishment of finally having it running should be great.

              The problem is -- it's very smart and unique, while organizations that have this kind of problem don't want to depend on the unique skill set of a few highly capable individuals. Everything needs to be boring and people have to be replaceable.

              In this paradigm, vendor java with aws lock-in is a cost, but in-house fancy stuff with cobol on k8s done by smart people in house is worse -- it's a risk.

            • SonOfLilit 3 hours ago

              The need applies everywhere; the difficulty of fulfilling it tends to be an order of magnitude greater in places that tend to run COBOL.

              I'm working at one. You wouldn't believe the stories.

        • mathgorges 9 hours ago

          This is fascinating to me as an ex-mainframer that now works on a niche hyperscaler. I would love to learn more!

          Will you let me know some of the names in the space so that I can research more? Some cursory searching only brings up some questionably relevant press releases from IBM.

          • yourapostasy 8 hours ago

            Sounds like they’re talking about running IBM Wazi on Red Hat OpenShift Virtualization. As far as I know, there isn’t a System z-on-a-container offering, like you install from a Helm Chart or comes to you from an OCI registry. If it is the IBM I know, it’s completely out of reach of most homelab’ers and hobbyists.

            IBM Wazi As A Service is supposed to be more affordable than the self hosted version and the Z Development and Test Environment (ZD&T) offering. ZD&T is around $5000 USD for the cheapest personal edition, so maybe around $2500-3500 USD per year?

        • danielmarkbruce 12 hours ago

          Yup, but the COBOL application doesn't know you've done that.

      • Muromec 13 hours ago

        A different kind of cloud, you could say.

        • danielmarkbruce 13 hours ago

          ha yes. There is actually a pretty cool product made by a division of Rocket Software named "AMC"; it takes a COBOL app running on an IBM system and deploys it to a whole set of services on AWS. There are some smart dudes at that shop.

          • Muromec 13 hours ago

            Doesn't surprise me at all, somebody out there should be smart enough to make good money on that and not be very loud about it either.

      • WesleyJohnson 9 hours ago

        We're running RM/COBOL on RHEL8 VMs powered by VMware. I don't work with it, I'm in a different IT area, but our COBOL codebase supports the lion's share of our day-to-day operations.

  • 9659 13 hours ago

    Ada is an order of magnitude more modern and sophisticated than your other examples.

    I expect Ada will capture 0.05% of the market for the next 100 years.

    • tombert 12 hours ago

      Fair, I guess the list was “languages that I know were popular at one point but I don’t know anyone really using now”.

      Ada definitely does seem pretty cool from the little bit I have read about it. I’m not sure why it’s fallen by the wayside in favor of C and its derivatives.

      • aidenn0 12 hours ago

        Ada was mandated by the DoD for a bit. My understanding is that, in practice, this involved making a half-hearted effort in Ada, failing and then applying for a variance to not use Ada.

        • actionfromafar 12 hours ago

          Often, I'm sure, but there are large code bases in Ada still. It's a shame, it looks like a really great language I would love. But it's a chicken and egg problem. If only Mozilla had decided on Ada instead of Rust! :-)

          • cyberax 8 hours ago

            Ada doesn't offer any safety for dynamic memory. In fact, Ada is now adopting Rust's approach with the borrow checker.

            • actionfromafar 3 hours ago

              Great! Time to jump on the Ada bandwagon then! ;)

        • hardburn 11 hours ago

          I actually met a programmer who worked on military jets. According to her, Ada is only used anymore for the older jets that were already programmed in it, and she worked in C++.

          • greenavocado 10 hours ago

            Military jets coded in C++. God help us all.

            • FpUser 10 hours ago

              No need to be so dramatic. Shitheads will make software fail in any language. Memory "safety" will not help you correctly and in a timely manner calculate the position of flight controls, for example.

              • User23 9 hours ago

                One can write reliable software, and I mean airtight, good enough for medical devices and nuclear deterrence, in basically any even vaguely modern language (think Algol-60 or later). It's simply a matter of disciplined design and running on hardware that's sufficiently predictable.

          • 9659 8 hours ago

            yes, this is true. mainly due to a perceived lack of ada programmers on the market.

    • wbl 7 hours ago

      The one shop that really used it is now open to C++ and I expect Rust. But their projects tend to last a long time: 3 generations have flown in one of them, etc.

    • 7thaccount 12 hours ago

      Ada is pretty cool, but not sure if any more modern than APL. Both are actively maintained and useful in different areas.

      • int_19h 2 hours ago

        Ada has seen quite a few major features added to it in the past couple of decades.

    • thayne 9 hours ago

      Modern fortran is actually fairly modern too. But most fortran codebases aren't modern fortran, they're Fortran 77. If you're lucky.

      • atrettel 9 hours ago

        I agree that many modern Fortran codes aren't truly "modern" Fortran, but in my experience most codes have at least been ported to Fortran 90, even if they largely keep a lot of Fortran 77 baggage (especially the type system and indentation!). In all of my experience, I've really only encountered a single Fortran code being used currently that is actually Fortran 77 in the flesh. That said, I still think many Fortran codes would benefit from using more modern features, since so many are stuck in the past and are difficult to maintain for that reason.

        • jsrcout 6 hours ago

          The codebase I've been working in lately is mostly pre-77 FORTRAN, maintained as such for all this time. "Stuck in the past" is an apt description.

        • thayne 6 hours ago

          Perhaps I should have said "originally written in f77", and still look like it.

  • ecshafer 8 hours ago

    Fortran is pretty nice to write in if you are just writing numerical stuff. If I were just doing a pure numerical simulation, I would rather do it in fortran than c++ or python (without numpy which is just fortran and c++)

  • eslaught 9 hours ago

    I wrote a small program in Algol 68 once. It was horrible because it didn't even have heap allocation in the language, so things you'd think of doing in C (e.g., tree data structures) just didn't work. That and all the compiler errors were pure numerical codes which you had to go look up in the manual (not provided). And forget about getting line numbers.

    I am very much glad I wasn't alive at the time this was the state of the art.
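
    As a contrast, here is a rough sketch (in Rust, since it comes up elsewhere in the thread; the `Tree` type and `insert`/`size` helpers are illustrative, not from the comment above) of why heap allocation matters for exactly that use case: a recursive tree type needs some form of heap indirection for its child links.

    ```rust
    // A recursive data structure needs heap indirection for its children;
    // without any heap allocation (the Algol 68 experience above), each
    // node would have to live in some fixed-size, statically sized pool.
    enum Tree {
        Leaf,
        Node(Box<Tree>, i32, Box<Tree>), // Box = a heap-allocated child
    }

    // Insert a value, keeping smaller values to the left.
    fn insert(t: Tree, v: i32) -> Tree {
        match t {
            Tree::Leaf => Tree::Node(Box::new(Tree::Leaf), v, Box::new(Tree::Leaf)),
            Tree::Node(l, x, r) if v < x => Tree::Node(Box::new(insert(*l, v)), x, r),
            Tree::Node(l, x, r) => Tree::Node(l, x, Box::new(insert(*r, v))),
        }
    }

    // Count the nodes by walking the structure.
    fn size(t: &Tree) -> usize {
        match t {
            Tree::Leaf => 0,
            Tree::Node(l, _, r) => 1 + size(l) + size(r),
        }
    }

    fn main() {
        let t = [3, 1, 4, 1, 5].iter().fold(Tree::Leaf, |t, &v| insert(t, v));
        println!("{}", size(&t)); // 5 nodes inserted
    }
    ```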

  • Mc91 13 hours ago

    I program an Android app for a Fortune 100 company. Last commit where someone edited a Java file was last week.

    Most of the new code from the past few years has been in Kotlin though.

    • Muromec 13 hours ago

      This. Nobody wants to have the COBOL problem again, so the developer hiring money follows the programming language popularity market (with a certain regulatory approved laf ofc)

      • psjs 11 hours ago

        “laf” or “lag”?

        • Muromec 3 hours ago

          Lag of course. Math doors only open once in 25 years, you know the drill.

  • adastra22 13 hours ago

    Fortran is not a legacy language.

  • Suppafly 13 hours ago

    >but is there any reason to think that in 30-40 years the only people writing Java will be retirees maintaining old banking systems?

    It feels like we're getting into that space already.

    • Muromec 13 hours ago

      Nah not really. People just started replacing COBOL with java and employers are wise enough to hire people who are 30-40 years minimum from retirement.

      It can also be upgraded in smaller chunks, and finding enough developers for the tool is an important metric corporate looks at.

      If anything, banks are actively optimizing for developer experience to make sure 60% of new hires don't run away in the first year. Banks are better at navigating those kinds of structural risks; they were just slow to recognize that such risks exist.

      If you have an episode of existential anxiety because of dat AI eating mijn job, getting a union job in a bank is a way to hedge this particular risk.

      • rightbyte 30 minutes ago

        > employers are wise enough to hire people who are 30-40 years minimum from retirement.

        Uhm... loyalty is punished and workers need to change jobs to keep 'market rate' wages. So dunno about that.

        I think it is more about that newcomers to the job market are easier to abuse.

      • gwd 13 hours ago

        > ...employers are wise enough to hire people who are 30-40 years minimum from retirement.

        Um oh yeah, the reason we're hiring 20-year-olds is because we want to ensure we have lifelong support for the new system we're writing. Not because they're cheaper, they're still idealistic and naive, they'll work long hours for foosball tables and stacks, or anything like that...

        • Muromec 13 hours ago

          In a place where you can imagine having COBOL, working long hours is frowned upon and being idealistic beyond personal integrity isn't a good quality either. Not saying such places aren't cheap, as of course they are. Being cheap is their exact point.

      • User23 9 hours ago

        > employers are wise enough to hire people who are 30-40 years minimum from retirement.

        Well I hope they’re wise enough to not let any good employment attorneys catch wind because that’s blatantly illegal.

        • Muromec 3 hours ago

          It's not a requirement, but the demographic outcome of their hiring is very visible.

    • strken 10 hours ago

      I think Android makes a difference here. Sure, a lot of people are on Kotlin, but a lot aren't.

  • Yodel0914 8 hours ago

    I’m not sure I’d choose to use Fortran, but at one point I had to maintain an app that had a Delphi UI and Fortran business logic. The Fortran, although spaghetti, was much less frustrating to work with.

  • karlgkk 10 hours ago

    > in 30-40 years the only people writing Java will be retirees maintaining old banking systems?

    I kinda suspect that if Java is still around in 30 years, what we call Java will be - at best - vaguely recognizable.

  • jtolmar 8 hours ago

    I can't say whether Java as a whole will ever become the next COBOL, but Java 8 already is well on the way there.

  • RickJWagner 12 hours ago

    IBM offers a free COBOL plugin for VSCode and a nice tutorial with it.

    I started programming in COBOL (circa 1990) and took the tutorial just for fun earlier this year.

masfoobar 28 minutes ago

Condolences to the writer on his grandad's passing.

It is a bit of a reality check when words like 'grandpa' are linked to an article from 1992! My brain is expecting the article to be from the 60's, 70's... or possibly 80's.

My world view: it is hard to imagine a child born in 2000 is 24 years old now. Their grandparents could be as old as I am if they had children (and their children) at a young age.

Then I read at the end he was 91 when he passed. He did well! Likely around my Grandad's age - and he managed to last an extra 24 years on this planet!

I remember reading a book on COBOL in my younger days learning to program, alongside BASIC, C, and Pascal. I might still have it. Despite reading it, I have been fortunate (I guess) to have never programmed in it.

I do agree with the writer that using the word "dead" in the programming language world is unrealistic. Some would argue that there are popular, modern languages out there that are "dead" - but they might get a huge push for one reason or another in the future. Could COBOL find a new, niche spot?

Maybe.

kukkeliskuu 7 hours ago

Cloud is the new mainframe, except worse. It has all the downsides, but does not have the biggest upside.

The grandpa could create (using CICS) a very reliable and performant service that would call other services inside the same transaction. The platform would handle all the complicated stuff, such as maintaining data integrity.

Try to write AWS Lambdas that call each other within the same transaction.

  • sofixa an hour ago

    > It has all the downsides

    Vendor lock-in from a single vendor? Wildly expensive capex and opex? Impossibility for people to know any of the tech involved without you sending them on a course to learn about it or them already having experience with it?

    > Try to write AWS Lambdas that call each other within the same transaction.

    Why is that your comparison? Was deploying to the mainframe as simple as throwing a .zip with your code at an API that you could give access to developers?

msla 14 hours ago

"I don't know what the language of the year 2000 will look like, but I know it will be called Fortran." —Tony Hoare

COBOL is alive in that it keeps changing from era to era, to the point that modern COBOL looks rather little like the 1950s COBOL everyone instinctively thinks about when they hear the term. It's as if we were still programming in Algol because Java had been called Algol-94 or something.

  • Animats 14 hours ago

    Nobody writes

        MULTIPLY A BY B GIVING C ON SIZE ERROR STOP RUN.
    
    any more.
    • graypegg 14 hours ago

      I mean, if you squint your eyes a bit, that could be SQL! So even if it's not COBOL, there's people out there writing in a vaguely english business programming language.

      • tannhaeuser 13 hours ago

        So you spotted that? I have no proof or links to share, but I've always thought SQL was inspired by, or at least made to not look out of place next to, COBOL. I recall that the COBOL coding card layout interpreted a flag on punch cards, at the character column where top-level picture clauses needed to start, specifically for designating a line as SQL for static embedded SQL preprocessing.

        • DaiPlusPlus 13 hours ago

          I think it’s more that computers at the time didn’t all have lowercase characters. Consider that even C and C++ supported trigraph/digraph compatibility chars until something like last year (and IBM still complained…).

      • throw-the-towel 12 hours ago

        And SQL kinda dates from the same era, I wonder if this type of language was in vogue 50 years ago?

        • tdeck 5 hours ago

          The only notable similarities I see are lack of special characters, all caps by default (most languages from this era are actually case insensitive), and using English words. Those characteristics were in vogue 50 years ago because many computers didn't support lowercase characters, and the set of non-alphanumeric characters supported tended to vary a lot between machines. Here's what the Jargon File had to say about EBCDIC, for example:

          > EBCDIC: /eb´s@·dik/, /eb´see`dik/, /eb´k@·dik/, n. [abbreviation, Extended Binary Coded Decimal Interchange Code] An alleged character set used on IBM dinosaurs. It exists in at least six mutually incompatible versions, all featuring such delights as non-contiguous letter sequences and the absence of several ASCII punctuation characters fairly important for modern computer languages (exactly which characters are absent varies according to which version of EBCDIC you're looking at).

      • Suppafly 13 hours ago

        seriously sometimes writing SQL feels more like composing a google query than programming.

        • jl6 13 hours ago

          A great thing about being a programmer is getting to complain about the crappy requirements you have to work with. SQL, on the other hand, is not a program - it’s a precise specification of the result you want, in a format that lets the database engine write the “program” for you. Thus, writing SQL helps you appreciate the struggle to achieve good requirements, and there is a chance you will develop empathy for those cursed to write them.

          • int_19h 2 hours ago

            That can be said of any program written in a pure declarative language, but even so not all of them look like SQL. And, yes, they are still programs.

        • Ekaros 3 hours ago

          Well, it is in the name: Structured Query Language. And I would argue that it is very often the right mindset. You are trying to query data, not process it. Thus actually making it a query seems a rather reasonable paradigm.

      • zozbot234 14 hours ago

        The nice thing about a vaguely English like language is that your average LLM is going to do a better job of making sense of it. Because it can leverage its learnings from the entire training set, not just the code-specific portion of it.

        • kibwen 13 hours ago

          Not for generating it, because the more it looks like prose the more the LLM's output will be influenced by all the prose it's ingested.

          • crackez 11 hours ago

            I've used o365 copilot to analyze a COBOL app I had source code to, and it was great at explaining how the code worked. Made writing an interface to it a breeze with some sample code and I swear I am not a COBOL person, I'm just the Linux guy trying to help a buddy out...

            It also does a reasonable job of generating working COBOL. I had to fix up just a few errors in the data definitions as the llm generated badly sized data members, but it was pretty smooth. Much smoother than my experiences with llm's and Python. What a crap shoot Python is with llm's...

    • Smar 14 hours ago

      > print food if tasty?

      Ruby is nice.

      • zdragnar 11 hours ago

        Maybe I'm in a minority, but I genuinely dislike conditions placed afterwards.

        They feel great to type out when you're in the flow, but coming back and reading them grates on my nerves. Seeing the condition first means I load a logical branch into my mental context. Seeing the condition after means I have to rewrite the context of what I just read to become part of a logical branch, and now the flow of reading is broken.

        • User23 8 hours ago

          Try thinking of it as prefix if and infix if?

          And in any event it’s a very natural language pattern if you know what I mean.

    • analog31 10 hours ago

      Man that's almost like Hypercard.

    • kernal 7 hours ago

      >Nobody writes MULTIPLY A BY B GIVING C ON SIZE ERROR STOP RUN.

      You had me at MULTIPLY A BY B

  • 9659 13 hours ago

    This was almost true in 2000. It is not true now. Things change. Slowly.

  • j0hnyl 14 hours ago

    But are these legacy systems from the 70s, 80s, 90s using modern cobol?

    • NikolaNovak 12 hours ago

      Depends what you mean; but not necessarily.

      I am managing an ERP system implemented / went live in 2016. It's working on modern P10 hardware, which was released in 2021. The ERP system is continually updated by the vendor and customized by the client.

      Even for COBOL running on an actual mainframe, which I think most HNers would think of 1970s dinosaur, most of the actual machines in production would be pretty new. IBM z16 was launched in 2022.

      So they are "legacy systems" in the sense they're not written on a javascript framework which was launched last week, running on lambda instances in AWS :). But they are not "OLD" systems, as such.

    • jcranmer 13 hours ago

      Almost certainly yes. The "legacy systems" are likely running on versions of the mainframe with long-term support contracts, whose vendors are committed to providing newer compilers with support for newer versions of the specification as necessary.

    • ithkuil 14 hours ago

      When you hear about people being paid $X vs 10x$X to fix some COBOL, is there a correlation with the age of the COBOL system?

      • HeyLaughingBoy 13 hours ago

        Probably not; just a matter of how desperate they are.

        • ithkuil 2 hours ago

          Which is also a function of how hard it is to find someone who has the required skills to address the problem

  • MathMonkeyMan 12 hours ago

    More accurate might be "I don't know what the language of 2000 will be called, but I know it will look like Fortran."

martinclayton 4 hours ago

In case anyone is interested...

The SO Developer Surveys give some info on the job market for COBOL as it appears on the average salary versus years-of-experience graphs, which I like as there's as many stories or reasons as you can think of to explain them.

In 2023 there were 222 respondents who averaged 19 years of experience, and an average salary of $75,500. In 2024 the exact number of respondents is not shown, but likely similar based on the color code of the point, but the average experience had dropped to 17 years.

Elsewhere in the graph my favourite open question is: how come the over 2000 respondents mentioning Swift average over 11 years experience in a language that's only been public for 10 years?

2024 https://survey.stackoverflow.co/2024/work#salary-comp-total-...

2023 https://survey.stackoverflow.co/2023/?utm_source=so-owned&ut...

  • clarle 4 hours ago

    iOS development has been around for quite some time now. Most senior iOS and Cocoa developers probably started with Objective-C before slowly migrating codebases over to Swift.

    • martinclayton 3 hours ago

      I think this must be it, or at least this is one story that fits.

      Seems a shame that people report Objective-C experience as Swift experience to such a great extent. These surveys are not resumes...

      Perhaps it just "proves" that all data in these charts is questionable.

palisade 13 hours ago

Note: I'm getting some hate from others who think I would pick or prefer COBOL over a modern language. I wouldn't. I was making an outside-the-box "devil's advocate" objective observation. I just wanted to preface that here. Okay, the rest of my original comment remains below:

The irony is that we already had a memory safe and stable language in Cobol that was easier to read and understand than Rust. But, no one wants to use it so it is "dead" but it runs everything that made the modern age possible.

RUST:

    println!("Enter number: ");
    let mut input_string = String::new();
    io::stdin().read_line(&mut input_string).unwrap();
    let number: i32 = input_string.trim().parse().expect("Please enter a valid number.");
    let result = if number % 2 == 0 {
        "EVEN"
    } else {
        "ODD"
    };
    println!("The number: {}", result);

COBOL:

    display 'Enter number: '
    accept number
    if function mod(number,2) = 0
        move 'even' to result
    else
        move 'odd' to result
    end-if
    display 'The number: ',result

  • sestep 13 hours ago

    This is a weird take. Sure, plenty of cool/nice things from old languages (e.g. variable-sized stack frames in Ada) get lost, and some then get rediscovered by future languages, potentially wasting effort. And I don't know COBOL, so maybe you're actually making a good point.

    But I find that hard to believe. Does COBOL really solve all the same problems Rust is intended to solve? Is it as performant? Can it interface with native code from other languages in the same way? Does it have a usable and sane package manager built on top of a module system that facilitates composability and backward compatibility? Does it have a way to describe the shape of data and errors as ergonomically as Rust's algebraic data types?

    Genuinely curious: as I said, I don't know COBOL. I'd find it extremely surprising if the answers to all these questions are "yes," though. Just as there are reasons COBOL is still used, there are also (good) reasons new languages have been created.

    • palisade 12 hours ago

      A lot to unpack in this question.

      Do they solve all the same problems? No, for example COBOL lacks a modern concept of concurrency within a single program. COBOL's concurrency features are based on task-level parallelism, which involves dividing a program into multiple tasks that can be executed concurrently.

      Is it performant? Yes. COBOL is highly efficient particularly in handling large datasets and complex business logic and its compilers are optimized for reliability and speed.

      Can it interface with native code? Yes.

      Does it have a package manager? No.

      Does it describe shape of data? No. Data structures in COBOL are defined using fixed-length records.

      Note: I'm not a COBOL expert. I did learn it in college, though.

    • Muromec 13 hours ago

      Imagine having a shell script being called from a cron job that writes data in a bunch of tab separated memory mapped files (memory mapping happens when you configure the thing), but you have more files than memory. And all the shell scripts call and include each other and have global variables too.

      And that underpins most of the critical infrastructure in your country.

      • User23 8 hours ago

        Except mainframe IO and interrupts actually work reliably. Unix on the other hand is a proud member of the worse is better club. It still doesn’t really handle interrupts correctly, but thanks to 40 years of kludges most people consider it close enough.

  • kibwen 11 hours ago

    It's a bit odd to say these programs are comparable when the Cobol version isn't handling errors whereas the Rust program is (by panicking, but that's better than the silently wrong behavior of the Cobol one). Here's a runnable version of the above Cobol program (adding the necessary boilerplate); note that it prints "even" for an input of `abc` and "odd" for an input of `12`:

        identification division.
            program-id.
                even-or-odd.
        data division.
            working-storage section.
                01 num pic 9.
                01 result pic x(4).
        procedure division.
            display 'Enter number: '
        
            accept num
        
            if function mod(num, 2) = 0
                move 'even' to result
            else
                move 'odd' to result
            end-if
        
            display 'The number: ', result
        stop run.
    
    It's peculiar to call out Rust's syntax specifically when, like most other languages these days, it is mostly C-like (though with a sprinkling of OCaml). And syntax aside, Rust and Cobol have wildly different goals, so "just use Cobol" doesn't suffice to obviate Rust's purpose for existing.
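
    For completeness, a sketch of how the Rust side might handle the same bad input gracefully instead of panicking (the `classify` helper is illustrative, not from the original comparison):

    ```rust
    use std::num::ParseIntError;

    // Same even/odd logic, but bad input becomes an Err value rather than
    // a panic or a silently wrong answer.
    fn classify(line: &str) -> Result<&'static str, ParseIntError> {
        let n: i32 = line.trim().parse()?; // `?` propagates the parse error
        Ok(if n % 2 == 0 { "EVEN" } else { "ODD" })
    }

    fn main() {
        for input in ["12", "7", "abc"] {
            match classify(input) {
                Ok(kind) => println!("The number {} is {}", input, kind),
                Err(e) => println!("{:?} is not a valid number: {}", input, e),
            }
        }
    }
    ```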
    • palisade 11 hours ago

      Good catch! My cobol is rust-y. :D

      I guess my post is getting misread as "just use cobol" when it was more of a XKCD-like reflection; e.g. why did we all do that / keep doing that. We done did Cobol, and Rust. And, one is "dead" but not really and now here we are.

      https://xkcd.com/927/

  • erik_seaberg 11 hours ago

    Sorry, https://www.ibm.com/docs/en/cobol-zos/6.2?topic=statement-ex... seems to be demonstrating a language that is not memory-safe (maybe it used to be, but how?)

      COMPUTE SIZE-NEEDED = LENGTH OF OBJ + LENGTH OF VARTAB * NUM-ELEMENTS
      ALLOCATE SIZE-NEEDED CHARACTERS INITIALIZED RETURNING VPTR
      SET ADDRESS OF VARGRP TO VPTR
      MOVE NUM-ELEMENTS TO OBJ
      MOVE BUFFER(1:SIZE-NEEDED) TO VARGRP
      SET VPTR TO ADDRESS OF BUFFER
      FREE VPTR
    • palisade 11 hours ago

      The compiler would have rejected that, if I remember correctly. I'm not in the field of cobol myself, I learned it briefly in college ages ago.

  • Muromec 13 hours ago

    Shell script is memory safe too, but you don't write anything longer than 100 lines in it for a reason.

    • palisade 13 hours ago

      When you bank, COBOL (40% of online banks). When you use the ATM, COBOL (95% of ATM transactions). When you travel, COBOL (96% of airline ticket bookings). Healthcare, COBOL. Social Security, COBOL. Point of Sale, COBOL. IRS, COBOL. Pension funds? COBOL. Hotel bookings? COBOL. Payroll programs? COBOL.

      It is estimated that there is 800 billion lines of COBOL code in production systems in daily use. That is a bit more than 100 lines.

      This was why Y2K genuinely scared everyone and was a very real problem. The only reason we can look back at it and laugh now is that an army of engineers sat down and rewrote it all in the nick of time.

      • wglb 10 hours ago

        The Y2K effort was much more nuanced than this. I was there for it and it was more like highly targeted patching based on carefully crafted test sets and frameworks.

        > army of engineers sat down and rewrote it all in the nick of time.

        No way did all get rewritten. Where source was available, fixes were applied and systems retested.

        True drama ensued for programs for which the source was no longer obtainable.

        The company I was at during that time had programs that had been in production since at least 1960.

        The other effort that took place was attending to the systems during the midnight boundary with everybody either in the office or on call.

        The other strong observation was that the risks were very much not understood, with exaggerations both extreme and dismissive. Also not discussed in the popular press at the time was the extent to which most of these systems were not truly, totally automated.

      • Muromec 13 hours ago

        I'm a big enjoyer of the arcane arts, but I happen to work in a place that actually has it, and no -- nobody likes COBOL and it's not cool in any sense.

        • palisade 12 hours ago

          Well, there is a good reason no one likes it. It isn't cool, I completely agree. Readable, simple, safe, performant and still relevant though? Ya.

          • Muromec 12 hours ago

            >Readable, simple, safe, performant and still relevant though?

            It's performant, you can't take away that.

      • arcticbull 13 hours ago

        Legacy code yeah, nobody's hitting File > New Project in COBOL

        It's just that nobody understands how the systems work and they're ossified. Those systems are going to be emulated until our grandchildren take over because nobody can understand them well enough to craft a replacement. Juuuust up until an LLM rewrites them for us.

        [edit] I mean those airlines systems are so old that they don't support special characters on names, passenger names are two fixed-length fields (first name, last name) and title and middle name just gets appended together.

        So you get LASTNAME/FIRSTNAMEMIDDLENAMETITLE on your bookings. And each of those fields is truncated lol.

        and of course flight numbers are fixed at 4 digits, so we're running out of those.

        Not exactly a great ad.

        • toast0 12 hours ago

          "Legacy code" is also known as "the important code that makes the business work"

          If these new fangled languages are so great, one day they can be legacy code too. :P

          • Muromec 12 hours ago

            That's not what makes something legacy. Legacy is something highly inadvisable to change because it both makes the business work and can't be easily changed, because of complexity, loss of context, high blast radius or whatever. It's just there and you have to deal with it. If it weren't complex, opaque and scary to touch, it would have been just another piece of software to be replaced and updated, like the copyright date in the footer.

        • palisade 13 hours ago

          Oof, I've got good news and bad news for you.... they still are creating new code in it.

          Yeah, there are fewer engineers in COBOL which is why it pays BIG bucks now. They desperately need someone to maintain that massive infrastructure that has been built up over 75 years that cannot be replaced easily or quickly.

    • duskwuff 11 hours ago

      Besides - standard COBOL is only "memory-safe" by way of not supporting dynamic memory allocation. Like, at all. Even strings are stored in fixed-length arrays.

      "A ship in harbor is safe, but that is not what ships are built for."

  • hollerith 12 hours ago

    Bizarre comment. No developer who should be allowed anywhere near a computer would ever consider choosing COBOL where Rust is appropriate or vice versa.

    • zozbot234 12 hours ago

      Agreed. It's easy to have memory safety when you don't even support heap allocation. Now if OP had said "Java" or "C#" instead of "COBOL", they would've had a solid point. But the way Rust ensures memory safety without mandating GC while still allowing for complex allocation patterns can be said to be practically unfeasible for any of the usual "legacy" languages, with the notable exception of Ada.

    • palisade 12 hours ago

      Well, I said it was ironic that we went out of our way to make a newer, more-complicated-to-read language that was memory safe when we already had a simpler, readable language that was safe.

      I didn't say I wanted to code in it, though. I'd prefer in no particular order Kotlin, Python, Go, C++, Rust, Perl, C#, Java, Zig, etc. Anything really over COBOL myself. I'm part of the problem.

      But, if I was hard up for money and wasn't getting nibbles for jobs? I could see getting into COBOL because there is a lot of money in it and always work available.

      My statement stands though: we need to do better when designing the syntax of our languages. COBOL is disliked, yet simple and readable. What does that say about our new languages? How hated are our "new" language remnants going to be when few of us are still around to maintain them 50 - 75 years from now? And how easy are they going to be to pick up?

      Addendum: I guess it won't matter if the singularity comes and just writes it all for us, of course. Then it will all just be machine code and we won't need these "only human" translation layers any longer.

      • strken 10 hours ago

        Is COBOL actually memory safe in the same way Rust is memory safe? I thought it was just "we don't allow dynamic allocation", and I'd assume programmers often implement their own half-baked dynamic allocation on top.

        • rightbyte 21 minutes ago

          Just like in Rust, the 'use after free' problem becomes the 'use after it does not make sense' problem instead. Which Valgrind won't find for you either.

          I think new Cobol has 'allocate' and 'free' though.
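          A toy sketch of what that looks like -- hand-rolled allocation on top of a fixed-size table (names and sizes invented for illustration). Nothing crashes and no memory is unsafe, but a stale index silently reads the wrong record:

```python
# "Half-baked dynamic allocation" over a statically sized table,
# the pattern described above. Illustrative sketch only.
POOL = [None] * 4          # fixed at "compile time", like a COBOL table
free = list(range(4))      # hand-rolled free list of slot indices

def alloc(value):
    slot = free.pop()      # grab a free slot
    POOL[slot] = value
    return slot

def release(slot):
    free.append(slot)      # note: POOL[slot] is never cleared

alice = alloc("ALICE")
release(alice)             # "freed", but the old index is still held
bob = alloc("BOB")         # reuses the same slot
print(POOL[alice])         # stale handle: prints BOB, no error anywhere
```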

    • 7thaccount 12 hours ago

      I don't think the use cases for Cobol (bank software) typically overlap with those for Rust (operating systems...etc).

      It's like saying no gardener should be allowed near a garden that would choose a shovel over a pair of shears. Both have a place.

snovymgodym 13 hours ago

As always, these discussions will depend on your definition of "dead" and "alive".

If we can call a technology dead once no new business is built on it, then I think we can safely call COBOL dead (and the IBM 390x aka Z/OS platform along with it, for which "COBOL" is usually a proxy).

But if we say that anything still being used in production is not dead, then of course COBOL is alive and significantly more alive than many other things which are younger than it.

But this shouldn't really be taken as a positive point for COBOL or the mainframe ecosystem. It's simply a fact of life that organizations tend to stick with the first thing that works, and for the types of entities involved in the first wave of digitalization (e.g. governments, banks, airlines) that was usually an IBM mainframe along with the software that runs on it.

  • 8fingerlouie 4 hours ago

    > and the IBM 390x aka Z/OS platform along with it

    The problem with killing the mainframe is that no other platform really exists that can handle the amount of simultaneous IO that you can get on a mainframe. Our mainframe easily processes 100m transactions per hour, with room to spare. And keep in mind that those transactions are for the most part synchronous, and will result in multiple SQL transactions per transaction.

    Yes, eventual consistency is a thing, but it's a very bad match with the financial world at least, and maybe also military, insurance or medical/health. You can of course also partition the workload, but again, that creates consistency issues when going across shards.

    Also, COBOL is far from dead, but it's slowly getting there. I don't know of a single bank that isn't actively working on getting off the mainframe, though all projections I've seen say that the mainframe and COBOL will be around until at least 2050.

    Give that a thought. That's 26 years of writing COBOL. Considering that COBOL programmers are also highly sought after, and usually well paid, one could literally still begin a career as a COBOL programmer today and almost get a full work life worth of it.

  • DaiPlusPlus 12 hours ago

    > we can call a technology dead once no new business is built on it

    You don’t suppose any bank - or other large financial institution - might have standardised on Cobol for their core business flows/processes? In which case a new business-unit or “internal startup” team (e.g. a new category of insurance product) might very-well have some part written in Cobol so it integrates with the rest of the bank - or at very-least might be built-on-top of the org’s existing Cobol-running infrastructure (i.e. Not written in Cobol, but still runs on Z/OS because there’s no budget for buying new commodity x86 racks and the people to manage and run them).

    • snovymgodym 11 hours ago

      Sure, I know for a fact that what you're describing exists. That's not really what I mean by new business being built on it. That's a case of a very large and old business already being so locked into the mainframe ecosystem for their core systems that anything new they try to do ends up needing some kind of integration system with the legacy system.

      What I mean is that nobody starts a business today and says "Ok, we need an IBM mainframe running DB2 and we'll have a bunch of COBOL, ReXX, and PL/I programs for handling our business logic".

    • makeitdouble 8 hours ago

      I was under the impression that banks with core COBOL processes all had an intermediate layer in Java/C# to deal with these kind of integration.

      We saw exactly the case of a new business unit being created, and like most other units it wouldn't get direct access to the lowest layer, and interact instead with a saner level of API and modules in the language of their stack.

      • jamesfinlayson 6 hours ago

        Yeah, that's my impression too - I haven't worked in banking but I've worked at a few places with core functionality written in Fortran and then web-facing API layers on top of that (some of it in Java, I think).

  • calibas 6 hours ago

    COBOL is undead.

gpraghu 12 hours ago

A touching article! I have enjoyed similar times with my grandpa. On the topic of Cobol, I simply don't understand why people hate it so much. It has a shallow learning curve like Python, is self-documenting enough that one doesn't need to write a bunch of text, and is available on every conceivable architecture, with great performance. I personally wrote a payroll for an entire factory on a B1800 with 128K of memory and a 10MB hard disk! So what's to complain? In my mind, Java is deader than Cobol!

  • ape4 12 hours ago

    It's the amount of boilerplate that people hate.

    • acdha 12 hours ago

      I think there’s something to that but there’s also a lot of selectivity there. Many of the same people who complained about COBOL because it was verbose adopted things like enterprise Java, so there’s more than a suggestion that this might be a less than completely objective assessment.

      The bigger problem: COBOL was an open standard but none of the implementations were open source for ages (I haven’t looked at GNU COBOL in years, but I think this is no longer the case) so nobody was building new things or experience when they had to pay to get started.

cwbriscoe 14 hours ago

Started my career doing Y2K stuff in 1998 and I still touch COBOL here and there today. I have a 10,000 line CICS program that runs every 30 seconds that I wrote in 2010. It has never failed since.

  • supportengineer 14 hours ago

    That's what I liked about developing Oracle stored procedures activated by cron jobs. Ran for 5 years, no maintenance needed.

    • lloydatkinson 13 hours ago

      That seems like a low barrier of expectations. I can think of several DB's that would run exactly like that.

  • bdjsiqoocwk 14 hours ago

    I don't understand these "never failed" comments. Without further context, it's meaningless. If I write a Python script and never change anything in its environment or inputs, it won't fail either. That's not specific to COBOL.

    • _old_dude_ 14 hours ago

      COBOL changes very slowly, once in a decade or two. Python does not offer support for a release for more than three and a half years [1].

      [1] https://en.wikipedia.org/wiki/History_of_Python

      • 0cf8612b2e1e 14 hours ago

        I could believe there are legacy installations happily humming away on Python 2.7 without issue.

        • remlov 13 hours ago

          Several years ago I briefly worked at a major telecommunications provider with services across the southern United States that ran Python 2.4 on their production provisioning servers. Worked just fine.

          • gavindean90 7 hours ago

            The difference being that the COBOL is still supported after a decade.

            • int_19h 2 hours ago

              ActiveState still offers a supported Python 2.7 version across all major platforms for those who need it (https://www.activestate.com/products/python/python-2-7/), so that's 14 years and counting.

              If enough stuff needs it, people will keep it running. Java 8 will probably be in the same boat eventually if/when Oracle finally drops support.

      • yieldcrv 14 hours ago

        But a compute instance or bare-metal computer that never needs a new release won't have to deal with that in Python either.

        It's only new builds on someone else's computer that have this modern issue.

    • tannhaeuser 12 hours ago

      I understand the context to be that COBOL, as a DSL for batch processing, declares its .data and .bss segments (or the equivalents on host systems) statically in the DATA DIVISION and usually doesn't dynamically allocate memory. This, coupled with CPU, memory, and I/O bandwidth reservation from a job scheduler on an exclusive hot-swappable partition on a host (z/OS aka MVS), plus UPSes, redundant disks/disk ports, and networks, makes "never fail" much more a consequence and primary objective of the mainframe architectures where COBOL workloads are usually run.

    • kibibu 14 hours ago

      I imagine the backwards compatibility story of COBOL is a little better than Python's

    • ang_cire 14 hours ago

      If you're actually patching your python installs, that is by no means certain.

      • andreasmetsala 14 hours ago

        I don’t think those mainframes running COBOL are getting patched either.

        • p_l 13 hours ago

          They are patched up regularly. The COBOL code itself maybe not, but the runtimes?

        • ang_cire 13 hours ago

          They absolutely are. Modern COBOL (even 25 year old COBOL) isn't running on ancient System360/370/390s or something, it's running on modern z/OS mainframes.

    • Spooky23 6 hours ago

      If the python script has external dependencies… lol.

adamc 14 hours ago

Technologies die very slowly once things of economic value depend on them. COBOL probably isn't used for new projects very often, but the economics of ditching it aren't very good either. It already works. Rewriting things that work is a great way to create new problems at great expense, so institutions are hesitant.

socketcluster 12 hours ago

It's interesting how articles from previous generations make it sound like people remembered what everyone in the tech industry said, as if everyone mattered. I guess there weren't many people around in the industry back then.

Nowadays, even if someone is right about something and most people are doing it wrong, nobody will care to even discuss it unless the person making the statement is one of maybe 3 top influencers in that field.

mbloom1915 14 hours ago

almost all major financial institutions, utilities, gov't agencies, etc still rely heavily on COBOL today. If it ain't (extremely) broken, don't fix it?

COBOL developers are literally dying out which has made for a competitive market for remaining talent. I've heard of some large consultants charging over $500/hr to their clients for a COBOL developer!

  • akavi 14 hours ago

    I feel like every time COBOL is mentioned we get these stories about crazy high comp for COBOL developers, but anecdotally my aunt worked on COBOL projects in the mid 2010s and was paid a much more modest $45/hr. Good money for small-town middle America where she lives, but nowhere close to what a decent JS dev can get.

    • chucksmash 9 hours ago

      There's also the difference between what a consulting company bills for the COBOL developer and what they pay the developer. Not every consultant is the captain of their own ship.

      My first job after college was a software shop organized in a "services" model, where clients would have to sponsor teams to do feature dev or support beyond initial onboarding. It's been a long time and my memory is hazy, but as I recall I was expected to bill ~40 hours a week to clients and if I only worked 40 hours that week (being OT exempt, this was always the goal), my hourly pay came out to between 10-20% of what the company billed the client.

      So $500/hr on the bill and $45/hr on the paycheck both manage to sound plausible, even at the same company.

    • ghosty141 14 hours ago

      Similar experience with a friend of mine. I feel like these high salaries only apply to people who have worked at one of these companies for a looong time.

      • Ekaros 3 hours ago

        High salaries are when a greybeard consultant is brought in for a few months to fix something or implement some new regulation. And I don't think it was ever as high as what financialised tech companies managed.

      • adastra22 13 hours ago

        High salaries are relative. $90k is a high salary for most people in the world, even for tech workers outside of Silicon Valley.

  • psunavy03 14 hours ago

    In COBOL implementations, it's generally not just knowledge of the language that makes you valuable, it's knowledge of the implementation at that particular organization. I'm not a COBOL dev myself, but I work with them, and part of the challenge is that everything is so uber-customized, tightly coupled, and there's 40+ years of undocumented business logic buried in the code.

    It's like the old joke about the engineer being asked for an itemized bill: "Chalk mark: $1. Knowing where to put it: $4,999."

  • bespokedevelopr 13 hours ago

    I work for a major utility and they used to run everything on mainframe and cobol but that went away long before I started programming. My coworker is nearing retirement, around 30 years here, and he started on cobol and worked on transitioning off. He has some really fun stories but my point being, the tales of cobol prevalence are very exaggerated. Maybe some parts of finance are still using it, not my area.

  • jeremyjh 14 hours ago

    I think the moat that COBOL developers have is not just their knowledge of the language, but knowledge of the mainframe programming and operating environment. It's just so alien to developers familiar with Windows/Linux, and there is really no way to get experience with the environment that I know of, other than to be employed doing it.

    But yeah, that stuff is never going away as far as I can tell. It's just too risky to rewrite those core systems, and many a boondoggle has tried and failed.

    • rodgerd 14 hours ago

      About a decade ago I looked into moving some COBOL components off-mainframe (either as COBOL-on-Linux or a rewrite into Java, which itself is really COBOL Mk II at this point), and your point about the operating environment is one of the key elements, but not all of it. There's also the fact that the first big shift to automation, via mainframe assembler and COBOL, was when companies sacked a lot of the folks who knew how and why the pre-automation processes worked - that knowledge exists in the mainframe code and the heads of the people who work(ed) on it, and nowhere else. A rewrite or a replatform is very, very hard and risky as a result; the system is now defined by how the mainframe runs the processes, to a very large degree.

      The third is that COBOL is only the tip of the iceberg. As soon as I spent time learning about the code I was being asked to look at, you get into decades of evolving programming practises. Modern COBOL is multithreaded, probably uses DB2 and relational datamodels. COBOL from thirty years ago is probably single-threaded, only runs right on high-clocked single-execution models, cuts down to hand-written s390 assembler regularly, and uses VSAM files with non-relational data. Older code still will be sharing data simply by banging it into memory regions for other code to read out of, because that's how you got performance back in the day.

      Trying to identify how you'd pull a function out of that and move it off is somewhere between extremely difficult and impossible. It's usually so complicated and expensive it's easier to try and hire people who want to apprentice as mainframe programmers and keep the current codebase running.

      • mschuster91 13 hours ago

        > A rewrite or a replatform is very, very hard and risky as a result; the system is now defined by how the mainframe runs the processes, to a very large degree.

        And that's why so many neo-banks/fintechs are eating the lunch of the established banks left and right, same for insurance. The "old guard" is unwilling to pay the costs of not just upgrading off of mainframes (aka the rewrite work itself)... but of changing their processes. That is where the real cost is at:

        When you have 213,000 employees like BoA has, and everyone needs at least 10 hours of training and 2 weeks until they're familiar enough with the new system to be fully productive, that's about 2 million man-hours just for training and 16 million hours in lost productivity; assuming a $50/h average salary, that's around 900 million dollars in cost. Unfortunately for the dinosaurs, the demands of both customers and (at least in Europe) regulatory agencies, especially for real-time financial transfers, push the old mainframe stuff to its limits, while at the same time banks don't want to cede more and more of that cake to PayPal and friends, who charge quite the sum for (effectively) lending money to banks.

        In contrast, all the newcomers start with greenfield IT, most likely some sort of more-or-less standard SAP. That one actually supports running unit and integration tests automatically, drastically reducing the chance of fuck-ups that might draw in unwanted regulatory attention.
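        As a rough sanity check of those numbers (a sketch using the comment's own assumptions - 10 training hours, two full 40-hour weeks of ramp-up, $50/h - none of which are real BoA figures):

```python
# Back-of-envelope retraining cost, all inputs assumed as in the comment above.
employees = 213_000
training_hours = 10                    # per employee, assumed
ramp_up_hours = 2 * 40                 # two weeks until fully productive
hourly_cost = 50                       # assumed average, in $/h

training = employees * training_hours             # ~2.1M hours
lost_productivity = employees * ramp_up_hours     # ~17M hours
total_cost = (training + lost_productivity) * hourly_cost

print(f"${total_cost / 1e6:.0f}M")     # prints $958M -- on the order of $900M
```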

        • jeremyjh 13 hours ago

          BOA doesn't train the vast, vast majority of its workforce on mainframe systems these days. No one working in a branch or call center is looking at green screens anymore. The mainframe systems are simply used as back-ends connected through web services (yes, even in CICS!) or MQ Series and the like to web GUIs.

          Source: worked there for many years, and built some of those integration systems.

        • panopticon 9 hours ago

          Eh, I think the tech stack is less important than the legal and regulatory structure.

          Most fintechs aren't banks and partner with a Real Bank™ to provide the actual bank accounts. Fintechs are under much less regulatory scrutiny (for now—that may be changing with recent, high-profile screwups) and can move with much more freedom regardless of the tech stack they've chosen.

    • psunavy03 13 hours ago

      Migrations are still a thing, with various approaches and success rates.

  • amelius 14 hours ago

    Can't we just apply a bunch of correctness preserving translations towards a modern PL, perhaps aided by an LLM to keep the source as human readable as possible, while (I'm stressing this) preserving correctness?

    • exhilaration 14 hours ago

      IBM offers just such a service under the WatsonX branding, it's an LLM to convert COBOL to Java: https://www.ibm.com/products/watsonx-code-assistant-z

      I work at a company with a large COBOL codebase and this has been mentioned in a few presentations about our modernization efforts.

      • grammie 13 hours ago

        You should take a look at my company, Heirloom Computing (Heirloom.cc). We have migrated many mainframe applications and millions of lines of COBOL and PL/I into Java, and deployed them into production on prem and in the cloud.

      • russfink 12 hours ago

        But is the conversion maintainable by a human? I’ve seen Fortran to C translators that end up encoding state transition machines that are impossible to read.

      • refneb 14 hours ago

        How did that go? My employer is going to try and evaluate the watsonx product. Have you had any luck converting large/complex COBOL modules?

    • Muromec 13 hours ago

      You can’t, unless you transform cobol to cobol and run the emulator on aws. It will still manage to fail you in some way

  • jillesvangurp 6 hours ago

    Not a bad gig to take if you can swallow your pride a bit.

    I bet LLMs can make working with COBOL a lot easier and more fun than it ever was. I bet that's true for a lot of legacy stuff.

    • the_af 5 hours ago

      Working with COBOL was never fun, so that's a low bar.

      Like others have said, what's valuable is an understanding of the business and legacy cruft that comes from spending time working at these kinds of companies/banks/etc., rather than knowledge of COBOL.

  • Spooky23 6 hours ago

    That’s for specialists for the mainframe or specific skill.

    Generalists are usually offshored and are cheap.

  • IshKebab 14 hours ago

    That seems like a myth to me. I actually looked up COBOL salaries and they were a bit higher (like 20%) but definitely not enough to make them tempting.

    • francisofascii 14 hours ago

      There is typically a big difference between a consultant's hourly rate and a full time salary hourly rate.

      • IshKebab 3 hours ago

        Yeah exactly. I was comparing like for like (contracts or full time). The difference due to the fact that it was COBOL was definitely not enough to make me want to learn COBOL.

  • Muromec 13 hours ago

    It is very much broken and said institutions don’t like it

  • the_af 14 hours ago

    COBOL jobs are not particularly well paid in my country.

    In any case, they would have to pay well by a large margin to justify working on dead boring legacy systems, too.

FLT8 3 hours ago

20 years ago I worked on a mainframe system that, at the time, was said to have "18 months to live". Fast forward to today, the system is more entrenched than it ever was, and still has "18 months to live".. I'm convinced it will outlive me, and probably the next generation too.

deenadz 13 hours ago

COBOL is dead, long live COBOL.

For any cobol devs here, we at https://cobolcopilot.com would love to hear from you

  • Muromec 13 hours ago

    You need to sell on-prem to those people. No way a single byte of that sweet sweet poison is going to ever leave the corporate network boundary.

WaitWaitWha 13 hours ago

(Programming) languages take very long to "die". Most often you will get a long drawn out tail, and often parts of a language gets absorbed into other languages. Only the sages and etymologists will know where they have come from.

Old man reminiscence following, skip if you are not bored:

I worked with SNOBOL and I thought it would be a long-term programming language. I also want to think that I had some tiny, minuscule hand in the development of RIPscrip pre-Telegraphix; alas, it went the way of the dodo.

I think I have forgotten more programming languages than I can count on my hands. Yet, I see them in some part every day in newer languages, "discovered" by some expert. "What has been will be again, what has been done will be done again; there is nothing new under the sun."

One language has come to my aid many times over the last 30-ish years: Perl.

(I'll tell you a secret - in the deep, deep bowels of a very, very large, jungle-named company, servers still have tiny Perl scripts running some core functions. I discovered this when there was a problem I had to deep-dive into. I made a recommendation to change a hard-coded variable. The answer was "it will take two weeks". Why? Because no one knew what it would do, or could read Perl. It was a 30-second job, including SDLC. Think xkcd Dependency https://xkcd.com/2347/ )

solatic 5 hours ago

COBOL is endangered, even for banks and airlines. Just look at the executives who decide to open new digital banks - they're not building on top of COBOL or mainframes. The old banks will be outmaneuvered by the new ones, which will eventually succeed them in the market.

The story of languages like COBOL isn't that a language can become too deeply embedded and too expensive to replace. It just means the replacement happens at a higher level - the business itself - and takes more time as a result.

  • nasmorn 5 hours ago

    A single cobol mainframe application is not a problem for a bank. Big banks are usually made by buying up dozens of other banks so they might have very many of these mainframes running and interoperating. That is where the real insanity lies

ryukoposting 8 hours ago

When I was in college, I knew a guy who got an internship at Wells Fargo writing COBOL. He hated it.

The punchline is that this was in 2018.

bigiain 5 hours ago

This makes me feel old.

In '92 I was maintaining COBOL code for a custom written warehouse management system for a wholesale boat bits distributor. The company that wrote it had lost almost all their COBOL devs, and were all in on Windows NT application dev.

I hate to admit it to myself, but I am in fact _just_ old enough that I could have cs grad aged grandkids, if I'd had kids early and they'd also had kids early. :sigh:

facorreia 11 hours ago

I worked for a company in the late 1980s that started developing with a 4GL product (Dataflex) instead of COBOL. The article is right that COBOL has outlasted most (all?) of those 4GL solutions.

Looking back, COBOL would have been a better technical choice back then. Dataflex's metadata-based dynamic UI and report generation saved some simple, repetitive work, but much more effort was wasted working around its limitations.

nrollinson an hour ago

COBOL's gone? Time to tell grandpa his coding skills are officially retro chic.

mcv 11 hours ago

Just this week a colleague asked if someone knew Cobol. Apparently another team had a Cobol-related issue.

So despite its long death, it still seems to be kicking about. I doubt we'll ever get rid of it.

happyjim 13 hours ago

Key components of the U.S. Internal Revenue Service tax processing code (e.g., the "Individual Master File" or IMF) are written in COBOL and IBM Assembly Language.

There is an ongoing effort to refactor as Java. This will ultimately take years and cost $100s of millions of dollars. There is still a small but shrinking team of graybeards who can actually maintain the code, which has to be reprogrammed every year to accommodate changes to tax code.

See, e.g., IRS IT Strategic Plan documents, publicly available.

HackerQED 8 hours ago

RIP. He was an old man with wisdom and a sense of humor.

kayo_20211030 12 hours ago

Great story. There's something wicked personal in it, and it's very good. I reckon that this bloke's grandfather was an interesting bloke - cobol or no.

LarsDu88 11 hours ago

As long as there are tactical nukes that depend on COBOL, COBOL ain't dead.

We might all die, but COBOL will sit happy in its steel-reinforced nuclear bunker.

  • diggan 11 hours ago

    Still doesn't beat Assembly, which will continue running on Voyager 1 even after the inevitable demise of our planet. Would survive the end of our solar system too.

    • LarsDu88 7 hours ago

      Assembly ain't a language. It differs for every chip architecture. Doubt there are many folks who know Voyager 1's assembly.

yawnxyz 11 hours ago

huh so are any languages actually dead? ChatGPT mentions FORTRAN, ALGOL, or Pascal... which I don't think are dead at all.

Ada I've never heard of, so maybe that one's dead?

If they're able to write WebAssembly compilers for all these languages, then they'll probably live forever!

The only reason punchcards are "dead" is bc the machines are gone or mostly unavailable...

  • int_19h 2 hours ago

    It depends on how you define "dead". ALGOL proper has been dead for many decades, but pretty much all mainstream general purpose PLs today are its direct descendants, and sometimes this ancestry is plain to see (e.g. every time you write "struct" or "void" in a C-like language, that's straight from ALGOL 68). I once wrote a comment on HN summarizing all the various bits and pieces I know of that are still around: https://news.ycombinator.com/item?id=18691821

  • marcolussetti 11 hours ago

    Ada is still updated, last released in 2023. Given its original audience is the Department of Defense, it seems to me very likely it is far from dead.

sshine 12 hours ago

I know someone my age (mid-late 30s) who is a COBOL programmer for a bank.

He's been that for ~5 years.

I don't think it's going away any time soon.

Crontab 10 hours ago

Do open source COBOL programs exist? Just wondering since I see it mentioned occasionally here.

sys_64738 10 hours ago

I recently found a 3.5” disk image I had with my 1990 COBOL programs on it.

  • iefbr14 an hour ago

    You are lucky. I started in '75 and my first cobol programs were on punch cards. Maybe some bits are still going round in napkins and toilet paper..

mckn1ght 14 hours ago

Huh, so it mentions 4GLs… what generation would we consider rust/kotlin/swift then?

  • jcranmer 12 hours ago

    The idea of programming language generations was based on paradigms of programming that never really caught on. The idea, roughly, is that 3GLs are those languages where you specify how something is to be done, 4GLs are where you specify what is to be done instead, and with 5GLs you specify the problem and the computer does everything for you.

    This breaks down with the fact that it's really difficult, outside of really constrained spaces, to turn a "what" specification into a high-performance implementation, so any practical language needs to let you give some degree of control in the "how", and as a result, any modern language is somewhere uncomfortably between the 3GL and 4GL in the paradigm, not fitting entirely well in either category.
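    A rough illustration of that how/what split, in Python (a toy example, nothing 4GL-specific):

```python
# "How" vs "what": total the even numbers in a list.
data = [3, 8, 1, 6, 4]

# 3GL style: spell out HOW -- loop, test, accumulate.
total = 0
for n in data:
    if n % 2 == 0:
        total += n

# 4GL/declarative style: state WHAT is wanted, roughly like
# SQL's SELECT SUM(n) ... WHERE n % 2 = 0.
total_declarative = sum(n for n in data if n % 2 == 0)

print(total, total_declarative)  # both 18
```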

  • stonethrowaway 14 hours ago

    They haven’t been around long enough to even be considered in the running.

    • adamc 14 hours ago

      4GL was really more a marketing slogan than a true generation. The idea was something like: with third-generation tools, you have to drive the car yourself, making every turn; with fourth-generation tools, you just say "Go to the Ritz!"

      It wasn't true, although they did make some operations easier in tangible ways.

      Rust is a wholly different kind of thing -- not easier than, say, Java, but lots more control with better guarantees. It's more a systems programming language. 4GLs were all application-focused.

  • rodgerd 14 hours ago

    The modern analogue of 4GLs is the promise that LLMs will let you write prompts so you don't have to learn a programming language; the promise of the original 4GLs, like Pearl (not to be confused with perl) and Objectstar, was to let non-programmers write business logic without becoming COBOL or FORTRAN programmers.

    • psunavy03 13 hours ago

      Ironically, the whole reason COBOL has its weird-ass syntax was to let non-programmers write business logic without being assembly or FORTRAN programmers. We can see how well that worked.

      • acdha 11 hours ago

        I think about that every time I hear someone saying LLMs will make programmers unemployable. There’s no doubt that the work will change but I think a lot of the premise is based on a fundamental misunderstanding of the problem: business systems are just more complex than people like to think so you’re basically repeating https://xkcd.com/793/ where people higher on the org chart think the problem is just cranking out syntax because they “know” how it should work.

        I think we’ve had at least 4 generations of that idea that reducing coding time will be a game-changer: the COBOL/SQL era of English-like languages promising that business people could write or at least read the code directly, 4GLs in the 80s and 90s offering an updated take on that idea, the massive push for outsourcing in the 90s and 2000s cutting the hourly cost down, and now LLMs in the form being pushed by Gartner/McKinsey/etc. In each case there have been some real wins but far less than proponents hoped because the hard problem was deciding what it really needed to do, not hammering out syntax.

        There’s also a kind of Jevons paradox at work because even now we still have way more demand than capacity, so any productivity wins are cancelled out. At some point that should plateau but I’m not betting on it being soon.

Frummy 10 hours ago

It's tragicomic. Since it's at the core of renowned institutions, I thought surely this must be a world of logical, crisp perfection: a perfectly engineered engine. If these systems are so important, at the very center of what makes society work and all flow of money, geniuses must have perfected it all thrice over. I wouldn't say reality was equal to 1/expectations^3, but maybe 1/expectations^2. Probably no one will relate, but a COBOL job was the first developer job of a relatively young guy like me. Crash course in tech debt, decades' worth of managerial shortsightedness, bureaucracy and all that. At least the naive hope provided energy to learn it better, so it wasn't useless. But maybe it veered on delusion when I hoped to rewrite ALL of it at the company I was in.

MarkusWandel 11 hours ago

Frankly, in all these stories about COBOL programs being modified for Y2K and whatever... isn't COBOL a compiled language? What's really amazing is that all these legacy systems still have buildable source code and the toolchain to build them with, i.e. that that stuff hasn't suffered "bit rot" or other neglect.

blastonico 13 hours ago

Soon after Java was released, when the hype around the language was on fire, people used to say that it was going to replace COBOL, thanks to the "write once, run anywhere" motto. Java did gain market share in the finance industry, but COBOL is still there.

throw0101b 14 hours ago

Bloomberg's Odd Lots podcast had an episode last year, "This Is What Happens When Governments Build Software":

* https://www.youtube.com/watch?v=nMtOv6DFn1U

One reason COBOL systems have been around for so long is because they encoded business rules that need to be understood if you want to try to transfer them to a new system. From the podcast (~16m):

> Like when we're working in unemployment insurance, again during the pandemic, my colleague was talking with the claims processors week over week and we're trying to dissect it and figure out what's going wrong and clear this backlog and one of these guys keeps saying, “Well, I'm not quite sure about that answer. I'm the new guy. I'm the new guy.” And she finally says, “How long have you been here?” And he says, “I've been here 17 years. The guys who really know how this works have been here 25 years or more.”

> So think about. You know, going from doing some simple cool, you know, tech app, you know, easy consumer app to trying to build or fix or improve upon a system that is so complex that it takes 25 years to learn how to process a claim.

> That's sort of, I think, what needs to be on the table as part of this agenda is not just “can the tech be better?” But can we go back and simplify the accumulated like 90 years of policy and process that's making that so hard to make?

Also an observation on how decisions are sometimes made:

> And I think that there's a deep seated culture in government where the policy people are the important people. They do the important stuff and technology, digital is just part of implementation, which is not just the bottom of a software development waterfall. It's the bottom of a big rigid hierarchy in which information and power and insights only flows from the top to the bottom.

> And so it's problematic in part because the people who are doing the tech are really just sort of downstream of everything else and the power and ability and willingness to step up and say “Hey, we probably shouldn't do those 6,700 requirements, we should probably focus on these 200, get that out the door and then, you know, add edge cases later.” There's no permission really to say that.

  • Muromec 13 hours ago

    > There's no permission really to say that.

    There is no permission to say that because your requirements are often set in black-letter law, and you didn't buy the right kind of suit to be present where they were decided over the last 40 years.

  • shadowgovt 13 hours ago

    > ...add edge cases later.” There's no permission really to say that.

    I think there would be some value to closing that feedback loop to give legislators the signal "You know, what you're considering is actually pretty fuzzy conceptually... We're discovering while considering how to code it up that you probably don't actually have good, clear definitions for all the terms in this bill." But the biggest thing to remember about government IT is the clientele, which changes the approach from commercial / industry software.

    Google can optimize for the common case. Google can cut the edge cases. Google can change APIs on a whim.

    Google's users choose to be Google's users and can go elsewhere if they don't like it.

    Government citizens don't have that choice. And in general, people don't lose access to their food if Google effs up. Or go without their legally-deserved unemployment pay. Or go to jail because their taxes were mis-calculated.

    In the government space, the "edge cases" are human beings, alike in dignity. The rules and policies end up complicated because human beings are complicated. And yeah, it ends up being some messy software. Because you can't just decide to ignore the law when it's inconvenient to plumb the information that the client has a child under the age of 18 who is not a dependent because they're an emancipated minor, but said emancipated minor does have a child of their own, and the client is the primary caregiver for that child while her parent is in prison... from here to there in the dataset.

    • Muromec 13 hours ago

      >Because you can't just decide to ignore the law when it's inconvenient to plumb the information that the client has a child under the age of 18 who is not a dependent because they're an emancipated minor, but said emancipated minor does have a child of their own, and the client is the primary caregiver for that child while her parent is in prison... from here to there in the dataset.

      That's all very true, but nobody ever codifies that. When the data doesn't fit the constraints of the form, which aims to handle a reasonable generalized case, you simply get a phone call from a human in the loop. That human has a supervisor, and you can also go to court when they write your name with E instead of É and try to bullshit you about some kind of ASCII/EBCDIC nonsense like it's real.

      In the end you have one dataset which tells who is a child of whom, another telling who has custody rights, and a third one making sense of amounts and recipients of childcare subsidies. Maintained by different departments, and eventually consistent, or maybe not.

    • spongebobstoes 12 hours ago

      My interpretation is a little different. We agree that humans are affected by the edge cases, although I believe that's also true at very large companies like Google or Meta.

      I don't think it's about avoiding programming 6700 edge cases, but more so that when you have an excessive number of cases, it's likely an indication that something is being missed. that could be due to a bug in software or due to unclear definitions in the legislation.

      in those cases, rather than attempting to program it exactly, it might be better to bring a human into the loop.

      and to me, that could be the point of having a tighter feedback loop. because otherwise the developers will just do their best, which will be buggy or incomplete. because they can't not do their job.