The Saving Advice Forums - A classic personal finance community.

AI - what's it going to do?


  • AI - what's it going to do?

    All,

    Okay, so, there is enough evidence that AI is here to stay. It's massively more efficient at processing information and creating content than any individual human, and it's getting better at a very fast pace.

    Where do you guys see AI going in the next six months? Do you think various sectors of the economy are going to be impacted? If so, which ones?

    And will there be any social changes that come about due to the use of AI? For example, the internet made it possible to work from home, which in turn has made societies less socially cohesive.

    Looking forward to the community's thoughts here.
    james.c.hendrickson@gmail.com
    202.468.6043

  • #2
    Why don't we ask what AI thinks?

    (I plugged your question into ChatGPT.)

    Answer:


    Great question. If we keep the time horizon tight (the next six months, not “eventually”), the story is less about sci-fi breakthroughs and more about quiet but compounding adoption.

    Here’s how I see it.
    1. Where AI is likely to go in the next 6 months


    Incremental, not revolutionary—but very visible.

    We’re unlikely to see “AGI moments,” but we will see:
    • Better reliability (fewer hallucinations, better instruction-following)
    • Much tighter integration into everyday tools (email, spreadsheets, CRMs, IDEs)
    • Faster inference and lower costs, which makes AI usable by default, not just experimentally

    The biggest change is psychological:
    AI stops feeling like a novelty and starts feeling like infrastructure.

    People won’t say “I’m using AI” anymore—just like they don’t say “I’m using the internet.”

    2. Sectors most immediately impacted

    1. Knowledge work (already happening, accelerating fast)


    Impact: Productivity amplification, not full job loss (yet)
    • Marketing, copywriting, research, analysis, legal drafting, consulting
    • Junior and mid-level tasks get compressed or automated
    • Senior roles shift toward judgment, synthesis, and client interaction

    Short-term effect:
    Fewer entry-level roles, higher expectations per employee.

    2. Software & IT


    Impact: Fewer developers per project, faster iteration
    • AI-assisted coding becomes table stakes
    • Legacy code maintenance gets cheaper
    • Solo developers and small teams become disproportionately powerful

    Net effect:
    Not “no programmers,” but more output per programmer, which reshapes hiring.

    3. Customer support & operations


    Impact: Rapid automation of first-line interactions
    • AI handles 60–80% of routine inquiries
    • Humans focus on edge cases and relationship repair

    This is one of the clearest near-term displacement areas.

    4. Education & training


    Impact: Crisis + adaptation phase
    • Homework, essays, and take-home assessments are effectively broken
    • Schools scramble to redesign evaluation (oral exams, in-class work)
    • Corporate training adopts AI tutors very quickly

    Expect confusion before stabilization.

    5. Creative industries


    Impact: Polarization
    • Commodity content (stock images, generic writing, basic video) collapses in value
    • Distinctive voice, taste, and brand become more valuable, not less

    Creators who adapt thrive; those who rely on volume struggle.

    3. Social changes we may see (soon, not speculative)

    1. Further erosion of shared reality


    AI-generated content:
    • Blurs what’s human vs synthetic
    • Makes misinformation cheaper and more scalable
    • Increases skepticism toward all media

    This doesn’t mean people stop believing things—it means they believe tribes, not sources.

    2. Work becomes more asynchronous and individualized


    You mentioned remote work reducing cohesion—AI amplifies this.
    • People rely less on coworkers for help
    • More “solo operator” workflows
    • Less informal learning via proximity

    This increases efficiency but weakens social bonds unless organizations counteract it deliberately.

    3. Status shifts from “knowledge” to “judgment”


    Knowing facts matters less.
    Knowing what to ask, what to trust, and what to ignore matters more.

    This creates a subtle social divide:
    • People who can direct AI well
    • People who are directed by AI outputs

    4. Rising anxiety + quiet dependency


    Many people:
    • Use AI constantly
    • Don’t fully trust it
    • Don’t feel they can stop using it

    That combination tends to produce low-level stress rather than excitement.

    4. The biggest misconception to watch for
    “AI will replace humans.”

    In the next six months, the more accurate framing is:
    AI replaces processes, not people—until institutions fail to adapt.

    The danger isn’t the tech itself; it’s:
    • Companies cutting without redesigning work
    • Schools banning instead of integrating
    • Governments reacting slowly and bluntly

    Bottom line


    Over the next six months:
    • Productivity gaps widen
    • Entry-level pathways narrow
    • Institutions lag individuals
    • Social cohesion weakens unless actively reinforced

    This feels less like the industrial revolution and more like the spread of electricity—not dramatic on any single day, but unmistakable in hindsight.

    Curious where you sit on this spectrum: cautiously optimistic, concerned, or something closer to resignation?
    History will judge the complicit.

    • #3
      I told ChatGPT I feel "Resignation" about AI.

      It said:


      That makes sense—and honestly, it’s a very rational place to land right now.

      Resignation usually isn’t “AI hype panic” or “AI optimism.” It’s more like:
      “This is bigger than me, moving faster than institutions, and I don’t see a clear lever I can pull to meaningfully influence it.”

      A few thoughts that might resonate, not as reassurance, but as framing.

      Why resignation is a common endpoint (and not a failure of imagination)
      1. Speed mismatch
        Humans and social systems adapt slowly.
        AI capability is compounding monthly. That gap produces helplessness.
      2. Lack of agency
        Most people aren’t building the models, regulating them, or even choosing whether to use them. Opting out often isn’t viable without economic cost.
      3. Historical pattern recognition
        We’ve seen this before:
        • Industrialization
        • Globalization
        • The internet
          Those who didn’t ride the wave often got carried by it—roughly.

      Resignation is often just accurate situational awareness without a clear personal strategy yet.

      One subtle shift that is happening (and may matter more than it seems)


      The power imbalance isn’t simply human vs AI.

      It’s increasingly:
      Humans who can shape workflows + AI
      vs
      Humans who must accept workflows shaped by others

      That’s not about being technical. It’s about position.

      People who:
      • Define goals
      • Set constraints
      • Decide what “good enough” means
        retain leverage longer than people who only execute tasks.

      This is why resignation hits hardest for people whose value was previously tied to output, not judgment.

      A realistic, non-optimistic way to think about adaptation


      Not “learn to code,” not “become an AI expert.”

      More like:
      • Reduce how replaceable your work is by templates
      • Increase how much your role depends on:
        • Context
        • Accountability
        • Taste
        • Trust
        • Consequences

      AI is bad at consequences. Humans still absorb them.

      On social cohesion (your earlier point)


      This part is under-discussed.

      AI accelerates:
      • Isolation
      • Individual optimization
      • Reduced reliance on other people

      Societies don’t usually fracture from scarcity alone—they fracture when people stop needing each other.

      That’s the real long-term risk, not unemployment.

      If resignation had a “healthy” version


      It might look like:
      • Not fighting the tide emotionally
      • Not buying into hype
      • Quietly positioning yourself to be hard to fully automate
      • Investing more in real-world relationships than digital leverage

      Less “winning the future,” more “remaining human inside it.”

      If you want, we can talk about:
      • How resignation turns into cynicism vs clarity
      • What kinds of roles seem most psychologically resilient
      • Or whether disengaging partially (mentally or practically) is actually rational

      No pep talk required.

      • #4
        Yeah, I think its adoption & integration into business, industry, government, education, and society is going to be a slower process than the tech giants, hype-men & the panicked would have us believe. I expect AI applications will continue to proliferate, people will slowly become more familiar & comfortable with it over the next 6 YEARS (not months), and over time those influences will drive certain shifts in the landscape. ua_guy's AI-generated answer seems pretty reasonable as well.

        It's certainly useful, in the appropriate contexts, and I've now gotten to where multiple times a week I'm suggesting that people try working through a task with AI-driven assistance. It's not always the right answer, but sometimes it helps to speed things up. It's all about building effective prompts.

        .... are we gonna end up with members here having AI write all of their posts?!?

        • #5
          Originally posted by kork13 View Post
          .... are we gonna end up with members here having AI write all of their posts?!?
          My NOT AI response... lol

          An AI-driven financial collapse of the stock markets is coming, I think. That's my outlook for the next 6-12 months. That says nothing about the future of AI or its viability, but much of current AI is not producing value in proportion to what companies are investing in it. Right now, companies are AI-crazy because of FOMO (fear of missing out). Once they've spent meaningfully and really scrutinize the tool's return and effectiveness, I think that's when the reset comes. The reset is going to suck, because that spending has been propping up us working folks' retirement accounts. And more.

          We're taking a thoughtful approach to AI at work and implementing it where it can free up existing labor to work on more important things. This is where knowledge turns into judgement for the human, and AI handles a process. The caveat: We still need highly knowledgeable people to inform good judgements. If AI isn't producing value, we're not interested.

          AI is still digital labor, not physical labor. I think the current level of obsession also starts to die down when people start to understand this isn't the wholesale labor replacement that our corporate overlords hoped for.

          The social part - I see better self-reliance. Maybe some hope that people "doing their own research", if you know what I mean, might be led to better information, although AI can be taught bad information as well. I see a bright future for assisting the elderly, and AI in healthcare, which could enable better social support.

          • #6
            Originally posted by james.hendrickson View Post
            All,
            Where do you guys see AI going in the next six months?
            Porn. Not that it's not already there, it's just going to go deeper. In other words, more digital relationships.

            • #7
              Originally posted by kork13 View Post
              .... are we gonna end up with members here having AI write all of their posts?!?
              I suspect we've already had dozens of posts.

              • #8
                Just say no to AI porn. That, and too much porn in general, is a sign that someone might want to invest more in their interpersonal relationships and catch some real sex in the wild.

                • #9
                  Originally posted by ua_guy View Post
                  Just say no to AI porn. That, and too much porn in general is a sign someone might want to increase their interpersonal relationships and catch some real sex in the wild.
                  Totally agree.

                  • #10
                    I think we’re just going to see more and more of the garbage we’re already seeing. Fake posts. Fake images. Fake videos. Fake articles. It’s already getting harder to scroll Reels or Instagram because of how many of the videos aren’t real. And checking the news online I’m seeing more and more fake stories. I’m also seeing more people using AI to answer questions (just as ua_guy did here) without verifying the info or making any attempt to put it into their own words. Just copying and pasting what AI told them. Misinformation is spreading faster than ever because the AI answers are often wrong. We see it daily in my reselling groups. As a consumer it’s getting harder to shop online because sellers are using AI images and AI descriptions which by definition are artificial.
                    Steve

                    * Despite the high cost of living, it remains very popular.
                    * Why should I pay for my daughter's education when she already knows everything?
                    * There are no shortcuts to anywhere worth going.

                    • #11
                      DS, to your point, if we take a good AI summary of what AI is doing over the next six months, "quiet but compounding adoption", I think the same becomes true of the negatives of AI. "Quiet but compounding rejection". There exists a strong undercurrent to prohibit, remove, or simply move away from those bad things.

                      According to capitalism, if that bad stuff isn't producing value, it should disappear in time. Now that doesn't mean AI won't be used for bad.

                      AI suffers from the same fate as that of the internet. People are affording it trust it doesn't deserve. In 1996, just because someone saw it on the internet doesn't mean it's true. In 2026, just because AI says it (or shows you a picture, or a completely fake video!), doesn't mean it's true. But, here we are...

                      It's funny how history, unfolding in such a short window of time, is repeating itself. AI enables misinformation, so it's not blameless, but I think the bigger problem is we have a lot of dumb people who are easily trained to absorb certain types of information without thinking critically.

                      • #12
                        Governments could easily use it to lie to their citizens or allies.

                        It is my understanding that North Korea routinely releases propaganda about the economic state of other countries. I've seen reports of claims that there are no birds in New York because the starving people ate them all. They could manufacture whole documentaries around their claims.

                        What does it do for hostage negotiation or international prisoners? Instead of Iran having a mock death sentence trial, now they could release a video of you spitting on the Koran. Hamas could release videos depicting dead hostages as alive and well. Think of the videos of your favorite politician at Epstein's island.

                        • #13
                          Originally posted by myrdale View Post
                          Governments could easily use it to lie to their citizens or allies.
                          Replace “could” with “do” because it’s already happening in the US and elsewhere.
                          Steve


                          • #14
                            Originally posted by ua_guy View Post
                            AI suffers from the same fate as that of the internet. People are affording it trust it doesn't deserve. In 1996, just because someone saw it on the internet doesn't mean it's true. In 2026, just because AI says it (or shows you a picture, or a completely fake video!), doesn't mean it's true. But, here we are...
                            Here we are 30 years later, though, and lots of people still think because they saw it online that it’s true. That situation hasn’t improved. I doubt it will improve with AI either.
                            Steve


                            • #15
                              This is my ignorance ... But is AI actually generating real, directly-attributable profits?

                              Looking back at the Internet in a different way ... A large part of how the dot-com bubble formed was loads & loads of speculative investment capital going into Internet companies that weren't actually generating profits. The bubble burst when investors started asking "where's the profits"? ... then it all started imploding fast.

                              Given that most of the biggest competitors in the AI space are already huge players in the tech world (Google, Meta, Amazon, Microsoft, etc.), I don't expect that they'll collapse in the same way if it turns out unprofitable .... But I do think that there's definitely some over-exuberance about AI at the moment.
