So, back in the early 90’s, I built a remote software update process for a system deployed on oil tankers. It worked over modems connected via a satellite telephone link. NASA does similar stuff with their interplanetary probes. These things can be done!
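The comment above doesn't say how that tanker update process actually worked, so purely as an illustration of the usual shape of such a scheme (chunked transfer, per-chunk checksums, retransmit until complete), here is a minimal sketch. Everything in it, names, chunk size, the simulated drop rate, is hypothetical and not the original system.

```python
# Illustrative sketch only, not the commenter's actual tanker system. It shows the
# general shape of a resumable, integrity-checked update transfer over a slow,
# unreliable link: chunk the image, checksum each piece, resend what went missing.
import hashlib
import random

CHUNK_SIZE = 512  # small chunks suit a low-bandwidth satellite modem link (assumed)


def make_chunks(payload: bytes):
    """Split an update image into (index, data, sha256) records."""
    return [
        (i, payload[o:o + CHUNK_SIZE],
         hashlib.sha256(payload[o:o + CHUNK_SIZE]).hexdigest())
        for i, o in enumerate(range(0, len(payload), CHUNK_SIZE))
    ]


class Receiver:
    """Reassembles chunks, tolerating loss and duplicates; reports what is missing."""

    def __init__(self, total_chunks: int):
        self.parts = {}
        self.total = total_chunks

    def accept(self, index: int, data: bytes, digest: str) -> bool:
        if hashlib.sha256(data).hexdigest() != digest:
            return False            # corrupted in transit; sender must resend
        self.parts[index] = data
        return True

    def missing(self):
        return [i for i in range(self.total) if i not in self.parts]

    def assemble(self) -> bytes:
        return b"".join(self.parts[i] for i in range(self.total))


if __name__ == "__main__":
    image = bytes(random.getrandbits(8) for _ in range(5000))  # fake update image
    chunks = make_chunks(image)
    rx = Receiver(len(chunks))

    # Keep retransmitting until the receiver has every chunk, as a link with
    # drops and long round trips would force you to do in practice.
    while rx.missing():
        for i in rx.missing():
            idx, data, digest = chunks[i]
            if random.random() < 0.2:      # simulate a dropped transmission
                continue
            rx.accept(idx, data, digest)

    assert hashlib.sha256(rx.assemble()).digest() == hashlib.sha256(image).digest()
    print("update image transferred and verified")
```

The point of the checksum-and-resume loop is that a flaky, high-latency link only costs you retransmitted chunks, never a corrupted image.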
Reminds me of reasons behind the Navy's C2C24 effort https://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=10501
Interesting article.
American history in a nutshell
1. Looming Obvious Problem
2. Americans: "Pshh, whatever"
3. Oh my God! I just got punched in the face! Who could have seen this coming?
Interesting read. Re: F-35 vs. B-17 build rates: under a war footing, and given similarly sized workforces, I wonder how quickly F-35s could be pumped out. I get that some of the processes take longer (carbon fibre, special coatings, etc.), but once ramped up I suspect modern manufacturing could spit out F-35s at an astonishing rate. Remember, tens of thousands of workers were mobilized to build B-17s. How many are involved in F-35 production?
Oh! You might be interested in Nicolas Chaillan's good-bye letter as well. https://www.linkedin.com/pulse/time-say-goodbye-nicolas-m-chaillan/
https://www.theregister.com/2021/03/25/f35_gao_report_fy2020_software_woes/
The F-35 program, and the Air Force, did use DevOps techniques to try to speed delivery, and weren't successful.
Agile development gets treated as the answer to all things, and like many software delivery philosophies it can be done badly. I wish the article went into more detail, but I'm sure I could go dig into the report (assuming it's public). I just wish folks would stop assuming Agile is the only way to manage software development. It's great when you're building incremental value and discovering what the end game actually looks like, since you may not know. Like anything, it can be screwed up, and that's on people making poor process decisions: maybe too dogmatic about things like Scrum, or not dogmatic enough...
You make some good points, and I can agree (certainly!) that you can do Agile well, and you can do Agile badly. I posted the above just as a data point. I guess I believe that software delivery methodologies aren't the problem, nor are they honestly the solution. If I were to try to pin down just what _is_ the solution, I'm afraid I'd have a tough time. These days, I'm pretty certain it's a mix of the right people, the right metrics, and the right motivations. But that's very hand-wavy.
Back in 1983, a Canadian historian, Gwynne Dyer, created a PBS series called War: A Commentary. In it he said that the next war we'd be in (he was referring to WW3, mind) would be a "come as you are" war, for the same reason you mentioned: it takes too long to replace the very expensive and complex weapons we now use to conduct a war.
BTW, keep up the good work! I really appreciate the pragmatic view you bring to the world.
Cheers!
Directly comparing 43,000 hours to build an F-35 with a B-17 coming off the assembly line every 4 days isn't a fair comparison. No doubt each B-17 took longer than 96 hours to build and each F-35 spent less than 4.9 years on the assembly line.
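To make that distinction concrete, here is a back-of-the-envelope sketch separating touch-labor hours, calendar cycle time, and line throughput. The 43,000 hours and the 4-day rollout come from the comment above; the crew size and work-in-progress figures are invented purely for illustration.

```python
# Back-of-the-envelope illustration: labor hours, calendar cycle time, and line
# throughput are three different numbers. Crew size and WIP below are assumptions.
F35_LABOR_HOURS = 43_000          # touch-labor hours per aircraft (from the comment)
B17_ROLLOUT_DAYS = 4              # one B-17 off the line every 4 days (from the comment)

# With an assumed crew of 100 working 8-hour shifts on one aircraft, calendar time
# is far less than the ~4.9 years you get by dividing 43,000 hours by 24 h/day.
assumed_crew = 100
calendar_days = F35_LABOR_HOURS / (assumed_crew * 8)
print(f"F-35 calendar build time with {assumed_crew} workers: ~{calendar_days:.0f} days")

# Conversely, a rollout every 4 days says nothing about how long each B-17 spent on
# the line; with (say) 20 aircraft in flow at once, each one took ~80 days.
assumed_wip = 20
print(f"Implied B-17 time on line with {assumed_wip} in flow: ~{assumed_wip * B17_ROLLOUT_DAYS} days")
```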
Stand by for rant: IMO the downside to the increasing use of AI and similar information processing lies in the dependence it produces in the user. Not so long ago I was in charge of a number of bright young people employed as intelligence analysts. I found that, for them, everything worth knowing came off the magic box and had to be at least SECRET/NOFORN or it wasn't worth reading. The idea of reading a book was alien to them, and going to the library was seen as goofing off. Attending the Geospatial Analysis course, I found that it was all about using computer-based systems, e.g. FalconView in those days. The idea of actually analyzing something, pulling together disparate information to form conclusions, wasn't even mentioned. When I attended an INSCOM exercise, this was regarded as an extraordinary idea, possibly worth looking into. Briefing on drug trafficking, I pointed out a possible node. A CIA guy jumped up and demanded to see the message traffic that supported it. When I explained that it was an analytical conclusion, not something that had been specifically reported, he seemed quite put out. Then, as the designated fool, while assigned to Special Forces I tried to sell the importance of OSINT. When my guys put together some OSINT stuff relevant to current ops, I got a nastygram from the G3 (Operations) to "quit sending that unclassified stuff."
OK, rant now concludes.
In this connection I recommend the article "The Pentagon's Silicon Valley Problem" in Harper's for March 2024.
Clearly you should definitely also be working on a zeppelin project given the blimp experience on your resume.
A lot of American software is very much misunderstood and poorly copycatted. Funny story: back in 2002 I was not cleared on the base for Rummy's visit, so I was put in a blind-shade room with a German painter tasked with doing some touch-up painting. Neither of us was cleared, so we got shoved into a back office while he visited. I risked it, peered out, and saw him walk across the lawn. This was at the George C. Marshall Center.
You're not wrong in identifying the inefficiency, but there is so much overall friction in the system that I think it's quite a stretch to call a win or loss based on that. Oftentimes it's the decision-making process that's the biggest holdup in deploying software for the government, at least in my experience.
Though it would certainly be impactful to be able to identify and exploit gaps in your adversaries' systems before they can realize it's happening and respond. And to close any of your own. What such gaps, if any, exist? How long, and to what extent, can they be exploited for productive purposes? Just because we know that something *can* be a problem doesn't mean that it will be, nor does it necessarily predict the impact one way or the other.
You could be right about that. I get a little annoyed with the talk of drones, sometimes, because I believe that electronic warfare will make it more difficult to use drones.
Yet, we are banking on drones.
But AI....
That’s kind of a problem. It’s possible to do AI, but it’s going to be a very difficult leap to move to an automated weapon system that can decide to kill.
Or if we just have AI that does ISR, that’s fine, but it still has to broadcast the intelligence back to a command center. That still puts out an electronic signature. I don’t think that artificial intelligence is a free lunch.
The other problem with AI is that, currently, the practice in the field is to treat 90% confidence at inference time as good enough, since this isn't life or death.
Moving that needle from 90% to 99.999% confidence is exponentially more expensive. AI inference with such large models will require better chipsets (GPUs, TPUs), and these aren't that small. Putting a "mobile" AI inference device embedded with any ISR capability means adding more compute. Even adding it to a tethered drone on a local LAN adds complexity and bulk. Sure, it can help the operator identify or be aware of something, but I think we are years away from this becoming trusted.
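To give the "exponentially more expensive" point some shape, here is a toy calculation, not anything from the comments above: it assumes you could drive error rates down by majority-voting independent models, each wrong 10% of the time. Real model errors are correlated, so treat this as an optimistic lower bound on the extra compute.

```python
# Rough illustration of why the last few nines are so expensive. Assumes
# (unrealistically) independent models, each wrong 10% of the time, combined by
# majority vote; the required ensemble size is a stand-in for extra compute.
from math import comb


def majority_vote_error(n: int, p: float) -> float:
    """P(majority of n independent models is wrong) when each errs with prob p."""
    need = n // 2 + 1   # number of wrong votes that produces a wrong majority
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(need, n + 1))


single_model_error = 0.10          # the "90% confidence is good enough" regime
target_error = 1e-5                # the 99.999% regime

n = 1
while majority_vote_error(n, single_model_error) > target_error:
    n += 2                          # keep the ensemble size odd

print(f"independent models needed: {n}")
print(f"ensemble error: {majority_vote_error(n, single_model_error):.2e}")
# Even under this generous independence assumption, the compute (and the chips,
# power, and weight that come with it) scales with the ensemble size.
```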
Yeah, it’s one thing to put it on a destroyer and say “shoot everything coming from that direction.” It’s another to ask it to discriminate between a farmer with a shotgun who is just protecting his sheep and an insurgent.