A quick summary

  • one of the benefits of LLMs is the ability to summarize things
  • one of the dangers of LLMs is the ability to summarize things
  • read any AI product’s value prop and it will say some variation of “it frees you up to do higher-order, higher-value tasks”
  • nobody knows what these higher-order things are
  • our attention is dwindling
  • the ability to focus for more than 10 minutes appears to be a huge competitive advantage
  • if the author of a book is good, they will have spent months or years pruning the content
  • if I tell you the sequence of notes that makes up a song, did you get the same value as listening to the song? Melody is created through silences (see B.B. King). Meaning and message are conveyed in pauses and structure
  • we are in a rush to get to the next thing, almost assuming the next thing is more valuable than the current thing. It usually isn’t
  • this is just a quick summary
