In 1986, electronic music pioneer Laurie Spiegel created Music Mouse, a way for those with a Mac, Atari, or Amiga computer to dabble in algorithmic music creation. Music Mouse is deceptively simple: Notes are arranged on an XY grid, and you play it by moving a mouse around. Back in 1986, the computer mouse was still a relatively novel device. While it can trace its origins back to the late ’60s, it wasn’t until the Macintosh 128K in 1984 that it started seeing widespread adoption.
Technology
What’s next for tech in 2024?
Have you ever wondered what the future will look like? Well, you don’t have to wait too long, because 2024 is going to be a year full of amazing innovations that will blow your mind. Here are seven emerging trends and innovations in tech that will no doubt transform our lives over the next year.
1) Artificial Intelligence continues to revolutionize our lives
AI is everywhere, from our daily gadgets like smartphones and smart speakers to our smart homes that can adjust the temperature, lighting, and security according to our preferences. But AI is not just making our lives more convenient and comfortable; it’s also set to revolutionize healthcare and other industries with some groundbreaking innovations.
Neuralink’s revolutionary wireless device
N1 implant (Neuralink) (Kurt “CyberGuy” Knutsson)
One of the most anticipated and ambitious projects in this field of AI is Neuralink, a brain-computer interface company founded by Elon Musk, the visionary entrepreneur behind Tesla and SpaceX. Neuralink aims to create a wireless device that can be implanted in the brain and connect it to a computer or a smartphone, allowing users to control devices, access information, and communicate with others just by thinking. Imagine controlling prosthetic limbs or enhancing cognitive abilities through thought alone.
Keep an eye on Kernel
Woman wearing “mind-reading” helmet (Kernel) (Kurt “CyberGuy” Knutsson)
Neuralink is not the only company working on brain-computer interfaces. Another one to keep an eye on is Kernel, a neurotech company creating a “mind-reading” helmet that uses sensors and lasers to gather information about the brain’s activity, blood oxygen levels, and more. The hope is that collecting detailed data on how the brain works and behaves could lead to new insights and breakthroughs in mental health, aging, cognition, and other aspects of brain health.
Meta bets on AI and the Metaverse
Man wearing Meta Quest 3 headset (Meta) (Kurt “CyberGuy” Knutsson)
Meta, the company formerly known as Facebook, is leading the way with its two long-term bets on the future: AI and the metaverse. These two technologies are not only advancing rapidly, but also converging to create new possibilities for human interaction and creativity.
AI is becoming more accessible and powerful than ever, thanks to Meta’s open-source models like Llama and Llama 2, which have been adopted and improved by millions of developers around the world. AI is also becoming more integrated into the products we use every day, such as Instagram, WhatsApp, and Messenger, where you can generate images, chat with assistants, write better, and edit photos with ease.
The metaverse is also taking shape, with Meta’s Reality Labs developing new devices and platforms that enable immersive and social experiences in virtual and augmented reality. The Ray-Ban Meta glasses were the first step towards a future where AI can see the world from our perspective and help us navigate it.
Ray-Ban Meta glasses (Meta) (Kurt “CyberGuy” Knutsson)
The Meta AI assistant is a new kind of companion launching in 2024 that can understand and respond to your voice, vision, and gestures. Since Meta is trying to play catch-up with ChatGPT, it has hired a couple dozen big-name celebrities to serve as the look and voice of the assistant, with each celebrity embodying a distinct AI character.
Meta AI assistant (Meta) (Kurt “CyberGuy” Knutsson)
2) Augmented reality is taking us to new dimensions
Woman wearing Meta Quest 3 headset (Meta) (Kurt “CyberGuy” Knutsson)
Augmented reality, or AR, is the technology that overlays digital information and images on the real world, creating a mixed-reality experience. AR is taking us to new dimensions, as companies like Apple, Meta, Snapchat, and Niantic are creating immersive and engaging experiences that enhance our perception of and interaction with the world. But don’t think AR is just for gaming and entertainment; it’s also for education and learning. AR can make learning more fun and interactive, as it can bring subjects and concepts to life.
Girl wearing a Meta Quest 3 headset to learn to play the piano. (Meta) (Kurt “CyberGuy” Knutsson)
Augmented reality is bringing shopping to you
Virtual try-on feature uses AR. (Amazon) (Kurt “CyberGuy” Knutsson)
Another domain where AR is making a big impact is shopping. AR can help you make better and more informed decisions, as it can let you try on products, see how they look or fit, and compare different options. Amazon and Walmart are already leveraging AR to provide immersive shopping experiences with their virtual try-on features. AR is reshaping our world, and we can expect to see more innovation and adoption in 2024 and beyond.
3) Expect advances in bioprinting artificial tissue and organs
Bioprinting research to create organs (MIT) (Kurt “CyberGuy” Knutsson)
Another exciting innovation in healthcare is bioprinting, which is the use of 3D printing to create artificial tissue and organs. Bioprinting could potentially solve the problem of organ shortage and transplant rejection, as well as enable personalized medicine and drug testing. Bioprinting is still in its early stages, but some companies and researchers like those at MIT have already made some impressive progress. AI is truly changing medicine as we know it, and we can expect to see more breakthroughs and applications in 2024 and beyond.
4) Autonomous electric taxi service becomes available
Autonomous electric taxi (Zoox) (Kurt “CyberGuy” Knutsson)
One of the most anticipated technologies in 2024 is the autonomous electric taxi service by Zoox, a subsidiary of Amazon. Their tag line is, “Built for riders – not for drivers.” Zoox has been developing and testing its self-driving vehicles in various cities since 2020, and plans to launch its service in 2024.
Zoox’s vehicles are designed to navigate complex urban environments with four-wheel steering, bidirectional driving, and a spacious cabin that can fit four passengers. Zoox’s taxis can be booked through an app or a kiosk, and offer a flat rate per mile. Zoox aims to provide a safer, more efficient, and more comfortable alternative to conventional taxis, and to reduce traffic and pollution.
5) More drone delivery services in the sky
Amazon’s Mk30 drone (Amazon) (Kurt “CyberGuy” Knutsson)
The year 2024 is expected to witness significant growth in the use of drone delivery services, especially in urban areas where traffic congestion and pollution are major challenges. Drone delivery services offer a fast, convenient, and eco-friendly way of transporting goods and services to customers, reducing the need for road vehicles and human labor.
One of the leading companies in this field is Amazon, which has added a third U.S. city where customers will have the option to get packages delivered by drone beginning in late 2024. For almost a year, the company’s Prime Air service has been using drones to safely deliver packages weighing up to five pounds in one hour or less. Prime Air is also unveiling the new MK30 drone design, which the company claims is quieter, smaller, and lighter than previous models.
6) More 3D-printed houses will go up
3D printed houses (Icon) (Kurt “CyberGuy” Knutsson)
Forget about hiring an old-fashioned contractor to build your next home. Imagine having a house built by a 3D printer. More of these homes are going to go up in 2024, as 3D printing offers a cost-effective and eco-friendly way of constructing houses.
A company called ICON is a leader in 3D printing technology for construction, with a mission to revolutionize the way we build and live. They have developed a robotic system that can print an entire house layer by layer, using a durable material called Lavacrete, which is a type of concrete that can withstand extreme weather conditions and natural disasters while also reducing waste and emissions. ICON has already printed several houses around the world, including the first 3D-printed community in Austin, Texas. ICON’s vision is to make 3D printing accessible to everyone and to create homes that are beautiful, functional, and resilient.
3D robotic printing technology (Icon) (Kurt “CyberGuy” Knutsson)
7) Increase in electric cars and car-sharing
Cybertruck (Tesla) (Kurt “CyberGuy” Knutsson)
2024 is going to be an exciting year for electric vehicles. According to some experts, electric vehicles (EVs) will account for more than 40% of new car sales in the US by 2024, thanks to the increasing affordability, performance, and environmental benefits of EVs. Whether you’re looking for a budget-friendly Kia Niro, an all-American Ford F-150 Lightning, a futuristic Tesla Cybertruck, a lavish Rolls-Royce Spectre, or a sleek Hyundai IONIQ 6, there’s an EV for everyone.
Car sharing is also expected to grow significantly in 2024, as more people opt for convenient and cost-effective transportation solutions. Some of the leading car-sharing platforms, such as Zipcar, Turo, and Getaround, will offer more options for EV rentals, as well as innovative features such as peer-to-peer sharing, autonomous driving, and smart charging. With electric cars and car sharing, the future of transportation looks bright and green in 2024.
Zipcar app (Zipcar) (Kurt “CyberGuy” Knutsson)
Kurt’s key takeaways
As we look ahead to 2024 and beyond, it’s clear that the world of technology is poised for exciting transformations. Artificial Intelligence, augmented reality, bioprinting, autonomous electric taxis, drone delivery services, 3D printed houses, and electric cars are all shaping a future that promises greater convenience, sustainability, and innovation. These advancements are not just changing industries; they’re revolutionizing the way we live, work, and interact with the world. So, fasten your seatbelts, because the journey into the future of tech is bound to have some bumps in the road.
What technology are you most excited to see or experience and why? Let us know by writing us at Cyberguy.com/Contact.
Copyright 2023 CyberGuy.com. All rights reserved.
Technology
Meta’s new deal with Nvidia buys up millions of AI chips
Meta has struck a multiyear deal to expand its data centers with millions of Nvidia’s Grace and Vera CPUs and Blackwell and Rubin GPUs. While Meta has long been using Nvidia’s hardware for its AI products, this deal “represents the first large-scale Nvidia Grace-only deployment,” which Nvidia says will deliver “significant performance-per-watt improvements in [Meta’s] data centers.” The deal also includes plans to add Nvidia’s next-generation Vera CPUs to Meta’s data centers in 2027.
Meta is also working on its own in-house chips for running AI models, but according to the Financial Times, it has run into “technical challenges and rollout delays” with its chip strategy. Nvidia is also dealing with concerns about depreciation and chip-backed loans used to finance AI buildouts, as well as the pressure of competition. CNBC notes that Nvidia’s stock dropped four percent after a November report about Meta considering using Google’s Tensor chips (TPUs) for AI, and late last year, AMD announced chip arrangements with both OpenAI and Oracle.
Nvidia and Meta did not disclose how much the deal cost, but this year’s AI spending from Meta, Microsoft, Google, and Amazon is estimated to cost more than the entire Apollo space program.
Technology
Criminals are using Zillow to plan break-ins. Here’s how to remove your home in 10 minutes.
The whole country is watching the Nancy Guthrie case. When the suspected kidnapping happened, I was curious. How long would it take me to find her home address and cell phone number on a people search site?
About 30 seconds.
I then pasted her address into Zillow and saw photos of her home. I could match what I found to the video from a home tour done on the Today show. I could see the layout. The entry points. The windows. Where her furniture sat. Imagine if I was a criminal armed with that info.
Here’s the thing: I’m not some hacker. I used free websites anyone can access from their couch.
This is happening everywhere
In Scottsdale, Arizona, two teens dressed as delivery drivers forced their way into a couple’s home. They duct-taped and assaulted the homeowners, looking for $66 million in cryptocurrency. They got the victims’ home address from strangers on an encrypted app.
Savannah Guthrie and mother Nancy Guthrie on Thursday, June 15, 2023. (Nathan Congleton/NBC via Getty Images)
In Delray Beach, Florida, a retired couple had their sliding glass door shattered by thieves. The attackers had their home address from leaked personal data. That crew went on to hit victims in multiple states.
Riverside, California, police confirmed detectives routinely find Zillow and Redfin searches on phones seized from arrested burglary suspects.
A former NYPD detective put it bluntly: today’s burglars can case your home from their chair with a cup of coffee and get better intel than they ever could sitting outside with binoculars.
The numbers are scary
Zillow’s database covers over 160 million homes. Listing photos often stay online long after a home is sold. That means photos of your home, taken when you listed it three, five, even 10 years ago, could still be sitting there right now showing every room, every door, every window and exactly where your security cameras are mounted.
Google Street View covers 10 million miles of road worldwide. Criminals use it to check out vehicles parked in driveways, scope backyards and plan escape routes. In some areas, police say thieves are even using drones to peer into windows and check for dogs.
Aerial drone shots of missing person Nancy Guthrie’s home on Tuesday, February 3, 2026 in Tucson, Arizona. Nancy Guthrie, mother of ‘Today’ show host Savannah Guthrie, is suspected of being abducted from her home earlier this week. (Fox Flight Team)
Anyone can type your name into a free people search site and get your home address in seconds. Then they plug it into Zillow and see your floor plan, entry points, window types and where the security cameras sit.
Unless you’re selling your home, take down your photos. Now.
Take it all down in 10 minutes
These steps can look a little different depending on your device, app version or browser. If it’s not exact, poke around. The option is there.
Zillow: Sign in at zillow.com. Click your profile icon > Your Home. Search your address, claim it, then go to Edit Facts and hide or delete the photos. Hit Save.
Redfin: Sign in at redfin.com. Go to Owner Dashboard. Select your home > Edit Photos > Hide listing photos > Save.
Realtor.com: Go to realtor.com/myhome. Claim your home, then select it under My Home > Remove Photos > Yes, Remove All Photos.
Google Street View: Open Google Maps on a computer. Search your address, drop into Street View, then click “Report a problem” (bottom right). Position the red box over your home. Under Request blurring, select “My home.” Submit. FYI, once it’s blurred, it’s permanent. Good.
A member of the Pima County sheriff’s office remains outside of Nancy Guthrie’s home, Monday, Feb. 9, 2026 in Tucson, Ariz. (Ty ONeil/AP Photo)
Pro tip: Ask your old listing agent to pull photos from the MLS. Once they’re gone from MLS, the feeder sites eventually follow.
Also, while you’re at it, search yourself on people search sites like Spokeo, WhitePages and BeenVerified. Most let you opt out. It takes some time per site, but it cuts off the first step criminals use to find you. Better bet is to sign up for Incogni, a sponsor of my national radio show and podcasts.
If you’re not selling, there’s zero reason for the internet to have a virtual tour of your home. Take it down today.
I guess you could say Zillow gives everyone an open house. Problem is, you never sent the invitations.
Know someone who bought a home in the last few years? Forward this. Their listing photos are probably still online and they have no idea. You can sign up for my 5-star rated newsletter at my website, Komando.com.
Copyright 2026, WestStar Multimedia Entertainment. All rights reserved.
Technology
Legendary composer Laurie Spiegel on the difference between algorithmic music and ‘AI’
By 1986, Spiegel was already an accomplished composer. Her 1980 album The Expanding Universe is generally considered among the greatest ambient records of all time, and her composition “Harmony of the Worlds” is currently tearing through interstellar space as part of the Voyager Golden Record, launched in 1977. But she is also a technical wizard: she joined Bell Labs in 1973, was instrumental in early digital synthesis experiments, and worked on an early computer graphics system called Vampire.
Spiegel was deeply drawn to algorithmic music composition and this new tool, the home computer. So, she created what she calls an “intelligent instrument” that enables the creation of complex melodies and harmonies with minimal music-theory knowledge. Music Mouse restricts you to particular scales, and then you explore them simply by pushing a mouse around.
Spiegel gives the user some control, of course. You can choose if notes move in parallel or contrary to each other, there are options to play notes back as chords or arpeggios, and there is even a simple pattern generator.
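To make that concrete, here is a rough, hypothetical sketch of how an instrument along these lines can work; it is not Spiegel’s or Eventide’s code, and every name and number in it is illustrative. The pointer’s position is quantized to a chosen scale, and a second voice is derived in either parallel or contrary motion.

```python
# Hypothetical sketch of a Music Mouse-style mapping (illustrative only,
# not Spiegel's or Eventide's code): quantize a pointer position to a
# scale, then derive a second voice in parallel or contrary motion.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]   # semitone offsets within one octave

def quantize_to_scale(position, scale=C_MAJOR, low_note=48, span=28):
    """Map a 0.0-1.0 pointer coordinate onto a MIDI note in the scale."""
    degree = int(position * span)               # which scale step
    octave, step = divmod(degree, len(scale))
    return low_note + 12 * octave + scale[step]

def second_voice(melody_note, pointer_y, mode="parallel"):
    """Derive a harmony note that moves with or against the melody."""
    if mode == "parallel":
        return melody_note - 4                  # a fixed interval below
    # "contrary": the higher the pointer goes, the lower this voice falls
    return quantize_to_scale(1.0 - pointer_y)

# Example: pointer at (x=0.62, y=0.35)
melody = quantize_to_scale(0.62)
harmony = second_voice(melody, 0.35, mode="contrary")
print(melody, harmony)   # two scale-quantized MIDI note numbers
```

The point of such a design is that the scale and the voice-leading rules live inside the instrument, leaving the player free to think in gestures rather than in individual notes.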
Though Music Mouse remained available for purchase until 2021, Spiegel never updated it to run on anything more current than Mac OS 9. Now, 40 years after its debut, it’s being reborn for modern machines with help from Eventide.
While it would have been easy for Eventide and Spiegel to overload the 2026 version of Music Mouse with countless modern amenities and new features, they kept things restrained for version 1.0. The core feature set is the same, though the sound engine is more robust and includes patches based on Spiegel’s own Yamaha DX7. There are also some enhanced MIDI features, including the ability to feed data from Music Mouse into your DAW or an external synthesizer.
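As a general illustration of what feeding note data to a DAW or an external synthesizer involves, here is a minimal sketch using the third-party mido library; it is not Eventide’s implementation, and the default-port handling is an assumption that depends on your system’s MIDI setup.

```python
# Minimal illustration of pushing generated notes out over MIDI, e.g. to a
# DAW track or a hardware synth. Uses the third-party "mido" library
# (pip install mido python-rtmidi); which port opens by default is
# system-specific, so treat the port handling here as an assumption.
import time
import mido

out = mido.open_output()            # open the system's default MIDI output

for note in (60, 64, 67, 72):       # an arpeggiated C major chord
    out.send(mido.Message("note_on", note=note, velocity=80))
    time.sleep(0.2)
    out.send(mido.Message("note_off", note=note))

out.close()
```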
Laurie Spiegel answered some questions for us about the history of Music Mouse, algorithmic composition, AI, and why she thinks the computer is a “folk instrument.”
What were the origins of Music Mouse? Was there something specific that inspired its creation?
When the first Macs came out, the use of a mouse as an input device, as an XY controller, was altogether new. Previous computers had just alphanumeric keyboard input or maybe custom controllers. The most obvious thing I immediately wanted to do was to be able to push sound around with that mouse. So, as soon as the first C compilers came out, I coded up a way to do that. Pretty soon, though, I wanted the sound quantized into scales, then to add more voices to fill out the harmony. Then I wanted to have controls for timbre, tempo, and everything else I eventually added.
How did you connect with Eventide for this new version?
I first met Tony and Richard of Eventide all the way back in the early 1970s. They are longtime good friends. I’d been involved in various music tech projects at Eventide over the years. Tony knew that I really missed Music Mouse and that I still get a fair number of requests for the 1980s versions from people who keep vintage computers from that era just to be able to run Music Mouse or other obsolete software. He decided it was a musical instrument worth reviving. I had been wanting to revive it, but hadn’t been able to find the time to even just keep up with the way development tech keeps changing. My main thing is really composing music, and I have an active enough career doing that to not have enough time to do coding as well. I am extremely grateful to Eventide for resuscitating Music Mouse. I hope a lot of people will get a lot of music out of this new version.
Did you feel compelled to make any big changes to it after 40 years?
We decided to keep 1.0 of this new version of Music Mouse functionally the same as the 1980s original. The exceptions are adding a higher-quality internal synthesizer and providing ways to sync it with other software, to record or notate its MIDI output. We have a growing list of features to add in 2.0.
“It’s pretty easy by now to use computers to generate music-like material that is not actually the expression of an individual human being.”
Are there any current innovations in music tech that excite you?
That’s a hard question, because I am not all that excited about music tech right now. It’s music itself that holds my interest — composition, form, structure. I love counterpoint and the various contrapuntal forms. I studied them extensively when I was younger. Of course, harmonic progression is something I’m also very interested in, and in algorithmic assistance for composing it.
That various kinds of structures within music can now be more easily dealt with in computer software has both pros and cons. The pros include how much more deeply we have to understand how music works, how it is structured, and how it affects us, in order to represent it as a process description in software. That means learning, research, and self-discovery. The cons include that it’s pretty easy by now to use computers to generate music-like material that is not actually the expression of an individual human being. Music is a fundamental human experience. There is no human society that doesn’t have it. But it is something that comes from within human beings, as personal expression, as communication, as a sort of form of documentation of what we are feeling, and as a means of sharing it.
You’ve been credited as saying that the computer is a new kind of folk instrument. Can you explain what you mean by that? How does something like Music Mouse fit into that model?
Now that everyone with a computer or even just a phone has the ability to record and edit and play back and digitally process and transform sound, and particularly ever since sampling became a common musical technique, people have been doing remixes, collages, sonic montages… doing all kinds of stuff to audio they get from others or find online. This is very like what we used to call “the folk process,” in which music is repurposed, re-orchestrated, given new lyrics or otherwise modified as it goes from person to person and is adapted to fit what is meaningful in successive groups of people.
Music Mouse will help people create musical materials that can be used in a potentially infinite number of ways. It is a personal, often home-based instrument played by an individual, like a guitar.

You refer to Music Mouse as an “intelligent instrument”; it automates a certain amount of creation. What is the appeal of letting a computer take the wheel to a degree, as an artist?
Music Mouse is not a generative algorithm or an “AI.” It’s a musical instrument that a person can play. It is, to some degree, what we used to call an “expert system,” as it has some musical expertise built in. But that is meant to be supportive for the real live human being who is playing it, not to replace them. It makes the playing of notes easier in order to let the player’s focus be on the level of phrasing or form. I have coded up generative algorithms for music. Music Mouse is not one of them. It’s an instrument that an individual can play, and it’s under their control. It enables a different perspective that’s from above the level of the individual note.
Do you see a connection between modern generative AI and algorithmic composition tools?
Of course. Algorithms can be used to generate music. I have written and used some. Music Mouse is not generative, though. It does nothing on its own. It’s a musical instrument played by a person.
What is currently called “AI” is different from previous generations of artificial intelligence. I expect there will doubtless be further evolution. In the early years of my use of computer logic in composing, AI was more of a rule-based practice. We would try to figure out how the mind was making a specific kind of decision, code up a simulation to test our hypothesis, and then refine our understanding in light of the result. After that, there was a period of AI taking more of a brute-force approach. Computer chess, for example, would involve generating all moves possible in a given situation, then eliminating those that would be less beneficial. Then neural nets were brought in for a next generation of AI. I look forward to getting beyond the imitative homogenizing LLM approach and seeing whatever comes next.
There are many ways of designing an algorithm that either generates music or else helps a human being to do that, making some of the decisions during the person’s creative process to leave them free to focus on other aspects. By taking over some of the decision-making, they can free a creative mind to focus on different perspectives. People just starting to learn music too often bog down and give up at the level of simply playing the notes, just figuring out where to put their fingers. We can make musical instruments now that let people use a bit of automation on those low levels to let them express themselves on a larger level, for example, to make gestures in texture-space rather than thinking ahead just one note at a time.
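The brute-force approach Spiegel describes above boils down to a simple pattern: generate every move possible in a situation, score each outcome, and discard the less beneficial ones. Here is a toy, hypothetical sketch of that pattern; the “game” is a stand-in (get close to a target number), not chess.

```python
# Toy sketch of the brute-force pattern described above: generate every
# move possible in a situation, score each one, keep only the best.
# The "game" is a hypothetical stand-in (get close to a target number).

TARGET = 21

def legal_moves(position):
    return [+1, +2, +3, -1]                    # toy move set

def score(position, move):
    # Higher is better: the resulting position ends up closer to the target.
    return -abs(TARGET - (position + move))

def best_moves(position, keep=2):
    ranked = sorted(legal_moves(position),
                    key=lambda m: score(position, m),
                    reverse=True)
    return ranked[:keep]                       # eliminate the less beneficial

print(best_moves(17))                          # [3, 2]
```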
“Music Mouse is not a generative algorithm or an ‘AI.’ It’s a musical instrument that a person can play.”
What do you think separates algorithmically generated music from something created by generative AI?
Artificial intelligence refers to a specific subset of ways to use algorithms. An algorithm is just a description of a process, a sequence of steps to be taken. A generative algorithm can make decisions involved in the production of information, and, of course, music is a kind of information. You can think of AI as trying to simulate human intelligence. It might have a purpose, such as taking over some of our cognitive workload. In contrast, the purpose of generative algorithms is to create stuff. In music, that purpose is to create an experience.
Music Mouse is not a generative algorithmic program. It’s more of a small expert system in that it has built into it information and methods that can help its player get beyond the level of just finding notes, to the level of finding personal expression.
Suno’s CEO Mikey Shulman has said that, “Increasingly taste is the only thing that matters in art and skill is going to matter a lot less.” In an age where music can be easily created using algorithms, plug-ins, and text prompts on cheap laptops and smartphones, do you see the role of composer being one primarily of curation?
I can see where he’s coming from, but, no, I don’t think so. The range and kinds of skills used in the creative arts will continue to evolve and expand. But the history of creative techniques shows them to be largely cumulative versus sequential. The keyboard synthesizer has not replaced the piano, which has not replaced the harpsichord or the organ. We have them all, that whole lineage, all still in use. Each musical instrument or artistic technique implies its own unique artistic realm. Each is defined by its specific limitations, which guide us as we use them. It is true that skills and traditional techniques will be an option rather than a prerequisite to creating music and art, but people will still do them. Just as LPs and chemical film have made comebacks recently, I expect to see traditional musical skills do the same. We have had computers and synthesizers for decades, yet there are still little children captivated by instruments made out of wood or painting or drawing, and I have yet to use any music editing software that gives me the fluidity and freedom of a pencil on staff paper. There will just be more kinds of complementary ways of making music.
More importantly, we humans have imaginations and emotions. There are internal experiences going on inside of us that we feel driven to express, to communicate, to share. It doesn’t matter what machines can generate on their own. We will always have those internal subjective experiences, emotion, and imagination, and people will experience them intensely enough to feel driven to create them external to their own selves in order to communicate and share them. You can’t replace human self-expression or the need for it by simulating their results. Artistic creation comes from a fundamental human drive, the need for self-expression. Artistic creativity is an essential method of processing the intensity of being alive.

You told New Music USA in 2014 that, in regard to electronic music, “There is no single creator… the concept of a finite fixed-form piece with an identifiable creator that is property and a medium of exchange or the embodiment of economic value really disappears.” Does this idea shape your views on ownership of art?
Those assumptions, which we inherited from the European classical model of music, are already much less prominent in our musical landscape. Improvisation, “process pieces,” the ease with which we can do transformations of audio files are all over the place. Folk music, and a lot of what we hear online here and there, might be audio that no longer has any known originator. We don’t know, and people don’t really care, who first created a swatch of sound. We are experiencing whatever has been done with it — different orchestrations, durations, signal processing. The huge proliferation of plug-ins and guitar effects pedals lets anyone transform a sound beyond recognition. This is composition on a different level than the level of the individual note, similarly to Music Mouse.
Another very important aspect of “folk music” is that it is typically played at home, with or for friends or family, or alone. This is very different from the formal concert settings and programming we in the US inherited from Europe. For me, the most important musical experience is just about always at home, where we live. To quote what Pete Seeger said in his write-up of Music Mouse in Sing Out: “she [meaning me] foresees a day when computer pieces will be like folksongs, anonymous common property to be altered by each new user. She would like to get music out of the concert hall and back into the living room.”
Music Mouse is available for macOS and Windows 11 for $29.