Researchers at Cornell University have developed an electronic chip that they describe as a "microwave brain." The simplified chip is analog rather than digital, yet can process ultrafast data and wireless communication signals simultaneously.
My understanding is that digital computers rose to dominance not because of any superiority in capability but basically because of error tolerance. When the intended values can only be “on” or “off,” a circuit can be really poor due to age, wear, or other factors, but as long as it’s within 40% of the expected “on” or “off” state, it will function basically the same as a perfect one. Analog computers don’t have anywhere near those tolerances, which makes them more fragile, more expensive, and harder to scale in production.
I’m really curious if the researchers address any of those considerations.
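A toy sketch of that error-tolerance argument (not from the article, and the noise model and stage count are just illustrative assumptions): a digital signal is re-thresholded back to a clean 0/1 at every stage, so even large per-stage noise barely matters, while an analog value has nothing to restore to and the same noise simply accumulates.

```python
import random

STAGES = 10     # number of circuit stages the signal passes through
NOISE = 0.4     # per-stage noise amplitude, 40% of full scale as in the comment above
TRIALS = 10_000

def noisy(x):
    """Add uniform noise of +/- NOISE to a value on a 0..1 scale."""
    return x + random.uniform(-NOISE, NOISE)

digital_errors = 0
analog_sq_err = 0.0

for _ in range(TRIALS):
    true_bit = 1        # arbitrary digital bit to transmit
    true_value = 0.42   # arbitrary analog quantity to transmit

    # Digital path: threshold back to a clean 0 or 1 after every noisy stage.
    d = float(true_bit)
    for _ in range(STAGES):
        d = 1.0 if noisy(d) > 0.5 else 0.0
    digital_errors += (d != true_bit)

    # Analog path: the noise just piles up; there is no reference to restore to.
    a = true_value
    for _ in range(STAGES):
        a = noisy(a)
    analog_sq_err += (a - true_value) ** 2

print(f"digital bit error rate : {digital_errors / TRIALS:.4%}")
print(f"analog RMS error       : {(analog_sq_err / TRIALS) ** 0.5:.3f} of full scale")
```

With 40% per-stage noise the digital bit comes through essentially error-free, while the analog value drifts by a large fraction of its full range after only ten stages, which is the gap the comment is pointing at.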