Things tagged tech:
Paul Sztorc has finally figured out how to communicate what Drivechain is, and here is a good hour and a half of him doing exactly that.
Ben Thompson at Stratechery:
I believe that Ajit Pai is right to return regulation to the same light touch under which the Internet developed and broadband grew for two decades. I am amenable to Congress passing a law specifically banning ISPs from blocking content, but believe that for everything else, including paid prioritization, we are better off taking a “wait-and-see” approach; after all, we are just as likely to “see” new products and services as we are to see startup foreclosure.
Orin Kerr presents Gus Hurwitz on The Volokh Conspiracy:
The most confounding aspect of the contemporary net neutrality discussion to me is the social meanings that the concept has taken on. These meanings are entirely detached from the substance of the debate, but have come to define popular conceptions of what net neutrality means. They are, as best I can tell, wholly unassailable, in the sense that one cannot engage with them. This is probably the most important and intellectually interesting aspect of the debate - it raises important questions about the nature of regulation and the administrative state in complex technical settings.
The most notable aspect is that net neutrality has become a social justice cause. Progressive activist groups of all stripes have come to believe that net neutrality is essential to and allied with their causes. I do not know how this happened – but it is frustrating, because net neutrality is likely adverse to many of their interests.
Tyler Cowen at MR:
Keep in mind, I’ve favored net neutrality for most of my history as a blogger. You really could change my mind back to that stance. Here is what you should do.
A film created by Carl Schlesinger and David Loeb Weiss documenting the last day of hot metal typesetting at The New York Times.
“It’s inevitable that we’re going to go into computers, all the knowledge I’ve acquired over these 26 years is all locked up in a little box called a computer, and I think most jobs are going to end up the same way. [Do you think computers are a good idea in general?] There’s no doubt about it, they are going to benefit everybody eventually.”
–Journeyman Printer, 1978
There is a bit of a backlash against UBI in the economics community at the moment, which I think is unfortunate and due to a lack of awareness of the changes that may be coming to the labor market. Here is a paper by Daniel Susskind on why mainstream economics may have it wrong on the labor issues:
The economic literature that explores the consequences of technological change on the labour market tends to support an optimistic view about the threat of automation. In the more recent ‘task-based’ literature, this optimism has typically relied on the existence of firm limits to the capabilities of new systems and machines. Yet these limits have often turned out to be misplaced. In this paper, rather than try to identify new limits, I build a new model based on a very simple claim – that new technologies will be able to perform more types of tasks in the future. I call this ‘task encroachment’. To explore this process, I use a new distinction between two types of capital – ‘complementing’ capital, or ’c-capital’, and ‘substituting’ capital, or ‘s-capital’. As the quantity and productivity of s-capital increases, it erodes the set of tasks in which labour is complemented by c-capital. In a static version of the model, this process drives down relative wages and the labour share of income. In a dynamic model, as s-capital is accumulated, labour is driven out of the economy and wages decline to zero. In the limit, labour is fully immiserated and ‘technological unemployment’ follows.
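The mechanics of task encroachment can be made concrete with a cartoon version of the idea. This is my own toy, not the model from the paper: the economy has a fixed number of equally valuable tasks, machines take them over one at a time, and workers compete for the shrinking remainder, so both the per-worker wage and the labour share of income fall toward zero.

```python
def labour_outcomes(encroached: int, n_tasks: int = 100, workers: int = 100):
    """Wage and labour share when machines perform `encroached` of the
    n_tasks tasks (each task worth 1 unit of output).  Toy numbers only."""
    labour_tasks = n_tasks - encroached
    wage = labour_tasks / workers        # per-worker wage from remaining tasks
    share = labour_tasks / n_tasks       # labour's share of total income
    return wage, share

for e in (0, 25, 50, 75, 99):
    wage, share = labour_outcomes(e)
    print(f"machines do {e:3d}/100 tasks -> wage {wage:.2f}, labour share {share:.2f}")
```

The real model is much richer (c-capital raises labour's productivity on the tasks it keeps, and s-capital accumulates endogenously), but the limiting behaviour is the same: as encroachment approaches the full task set, labour income goes to zero.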
Personally I see huge changes coming in the near to medium term (5-40 years), such that by the end of that period about 40% of the currently employed population will have no jobs available to them. (If minimum wage laws hold, that would be literally no jobs; if minimum wage laws fall, it means jobs that pay only subsistence wages.) That will cause major changes to the economy, of course. Goods will be produced at much lower cost, but how will people who don’t have jobs buy them? This is where UBI begins to make sense to me.
Then in the medium to long term we have the potential of an AI singularity, in which case, who knows. Even without a singularity, automation’s encroachment on the remaining refuges of labor will continue …
If you are interested in hardware reverse engineering of crypto systems, this is a very accessible talk.
A cinematographer attempts a digital vs. film comparison, in a very particular way. I think he fails to reach the goal he set for himself, but I don’t believe that the theory he presents is incorrect. In a few years, digital imaging tech and the math to perform the transforms will progress, and we will be there. Then this silly debate will be over :)
See also this revealing conversation he had with a film purist:
People are so religious about this that they’re resistant to even trying. But not trying does not prove it’s not possible.
If you believe there are attributes that haven’t been identified and/or properly modeled in my personal film emulation, then that means you believe those attributes exist. If they exist, they can be identified. If they don’t exist, then, well, they don’t exist and the premise is false.
It just doesn’t seem like a real option that these attributes exist but can never be identified and are effectively made out of intangible magic that can never be understood or studied.
To insist that film is pure magic and to deny the possibility of usefully modeling its properties would be like saying to Kepler in 1595 as he tried to study the motion of the planets: “Don’t waste your time, no one can ever understand the ways of God so don’t bother. You’ll never be able to make an accurately predictive mathematical model of the crazy motions of the planets — they just do whatever they do.”
Professional pilot Ron Rapp has written a fascinating article on a Gulfstream jet that crashed on takeoff in 2014. The accident was 100% human error and entirely preventable – the pilots ignored procedures, checklists, and warning signs again and again. Rapp uses it as an example of what systems theorists call the “normalization of deviance.”
The point is that normalization of deviance is a gradual process that leads to a situation where unacceptable practices or standards become acceptable, and flagrant violations of procedure become normal – despite the fact that everyone involved knows better.
I think this is a useful term for IT security professionals. I have long said that the fundamental problems in computer security are not about technology; instead, they’re about using technology. We have lots of technical tools at our disposal, and if technology alone could secure networks we’d all be in great shape. But, of course, it can’t. Security is fundamentally a human problem, and there are people involved in security every step of the way.
I have seen this personally many times: you can be sloppy several hundred (or thousand, or however many) times and it doesn’t bite you, so you come to believe that being sloppy carries no risk – and then, boom, out of “nowhere”, a failure. When you look back and analyze the failure, you will find that this complacency with the new normal of sloppy behavior is the root cause.
The behavior of the researchers is reprehensible, but the real issue is that CERT Coordination Center (CERT/CC) has lost its credibility as an honest broker. The researchers discovered this vulnerability and submitted it to CERT. Neither the researchers nor CERT disclosed this vulnerability to the Tor Project. Instead, the researchers apparently used this vulnerability to deanonymize a large number of hidden service visitors and provide the information to the FBI.
Does anyone still trust CERT to behave in the Internet’s best interests?
Another take on content-centric networking; see also Van Jacobson on CCN and Named Data Networking.
IPFS is a peer-to-peer distributed file system that seeks to connect all computing devices with the same system of files. In some ways, IPFS is similar to the Web, but IPFS could be seen as a single BitTorrent swarm, exchanging objects within one Git repository. In other words, IPFS provides a high throughput content-addressed block storage model, with content-addressed hyperlinks. This forms a generalized Merkle DAG, a data structure upon which one can build versioned file systems, blockchains, and even a Permanent Web. IPFS combines a distributed hashtable, an incentivized block exchange, and a self-certifying namespace. IPFS has no single point of failure, and nodes do not need to trust each other.
At first glance this is an attempt to get CCN working in the absolutely simplest way, by patching together existing tech. I respect this, and it’s not a diss to say so. Having a working stack to play with will be valuable. However, I think the hackiness of this approach will prove too much to actually implement, and it will give way to a ground-up stack like the one the NDN folks are building in order to get something that is performant and has features like mobile nodes built in.
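The core idea that makes all of this hang together (content addressing over a Merkle DAG) fits in a few lines. Here is a minimal sketch of the concept only; it is my own toy, not IPFS’s actual multihash encoding, object format, or DHT: every object is named by the hash of its bytes, and objects link to each other by hash, so any fetch is self-verifying.

```python
import hashlib
import json

store = {}  # hash -> raw bytes; stands in for the distributed hashtable

def put(obj: dict) -> str:
    """Store an object and return its content address (the hash of its bytes)."""
    data = json.dumps(obj, sort_keys=True).encode()
    h = hashlib.sha256(data).hexdigest()
    store[h] = data
    return h

def get(h: str) -> dict:
    """Fetch by hash; the hash check means we need not trust the node serving us."""
    data = store[h]
    assert hashlib.sha256(data).hexdigest() == h  # self-verifying fetch
    return json.loads(data)

# Build a tiny Merkle DAG: a "directory" linking to two "files" by hash.
readme = put({"type": "file", "content": "hello"})
main   = put({"type": "file", "content": "print('hi')"})
root   = put({"type": "dir", "links": {"README": readme, "main.py": main}})

# The root hash alone authenticates the entire tree beneath it.
assert get(get(root)["links"]["README"])["content"] == "hello"
```

Because names are hashes of content, identical blocks deduplicate for free, and changing any file changes every hash above it – which is exactly what makes versioned file systems and blockchains natural things to build on top.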
Matthew Brunwasser in the NYT:
“Every time I go to a new country, I buy a SIM card and activate the Internet and download the map to locate myself,” Osama Aljasem, a 32-year-old music teacher from Deir al-Zour in Syria, explained as he sat on a broken park bench in Belgrade, staring at his smartphone and plotting his next move into northern Europe.
“I would never have been able to arrive at my destination without my smartphone,” he added. “I get stressed out when the battery even starts to get low.”
Sarah Zhang in Wired:
Humans, quite simply, suck at walking in straight lines: Give a man a blindfold, and he will walk around in circles. In virtual reality, though, this failing becomes very convenient. People are so bad at knowing where they are in space that subtle visual cues can trick them into believing they’re exploring a huge area when they’ve never left the room—a process called redirected walking.
With VR technology getting ever better, people could one day explore immersive virtual spaces—like buildings or even whole cities—on foot in head-mounted displays. But it’s not very immersive if you end up smacking into a real-life wall. “This problem of how you move around when in VR is one of the big unsolved problems of the VR community,” says Evan Suma, a computer scientist at the University of Southern California.
That’s where redirected walking comes in. People don’t really notice, it turns out, if you increase or decrease the virtual distance they had to walk by 26 percent or 14 percent, respectively. Or shift the virtual room so they see their path as straight when their real path is curved. Or turn the room up to 49 percent more or 20 percent less than the rotation of their heads.
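The trick described above amounts to multiplying real motion by a “gain” that stays inside the user’s detection thresholds. Here is a small sketch of that idea using the thresholds quoted in the article; the function names and structure are mine, not from any VR toolkit:

```python
# Detection thresholds from the article: virtual distance may be scaled
# up by 26% or down by 14%; rotation up by 49% or down by 20%.
TRANS_GAIN = (0.86, 1.26)
ROT_GAIN = (0.80, 1.49)

def clamp(gain: float, bounds: tuple) -> float:
    lo, hi = bounds
    return max(lo, min(hi, gain))

def virtual_step(real_metres: float, desired_gain: float) -> float:
    """Virtual distance shown for a real step, with the scaling kept
    inside the imperceptible range."""
    return real_metres * clamp(desired_gain, TRANS_GAIN)

def virtual_turn(real_degrees: float, desired_gain: float) -> float:
    """Virtual rotation shown for a real head turn, likewise clamped."""
    return real_degrees * clamp(desired_gain, ROT_GAIN)

# Walking 10 real metres can read as up to 12.6 virtual metres,
# and a 90-degree real turn as as little as 72 virtual degrees.
print(virtual_step(10.0, 1.5))
print(virtual_turn(90.0, 0.5))
```

A redirected-walking controller then picks gains frame by frame (plus path curvature, which this sketch omits) to steer the user away from the real walls while the virtual path stays straight.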
IRONSIDES is an authoritative/recursive DNS server pair that is provably invulnerable to many of the problems that plague other servers. It achieves this property through the use of formal methods in its design, in particular the language Ada and the SPARK formal methods tool set. Code validated in this way is provably exception-free, contains no data flow errors, and terminates only in the ways that its programmers explicitly say that it can. These are very desirable properties from a computer security perspective.
The Ada-Europe 2013 paper (pdf) is pretty readable.
Recently while preparing a lecture on the influence of gear on music I puzzled over the formal differences between Chicago’s house and Detroit’s techno. Both owed a lot to the restrictions inherent in Roland’s rigid TR-808 and TR-909 drum machines and the absence of a budget for much more of a set-up. There are so many commonalities that I wondered what the general differences really are. And I don’t seem to be the only one confused. Reportedly, Derrick May thought they were doing house music – until Juan Atkins insisted on the techno tag, which he in turn had borrowed from Alvin Toffler. Researching gear lists, I eventually stumbled upon a device named DX100. It was used by virtually every Detroit producer (including Derrick May: somebody said Nude Photo employs the “Wood Piano” preset) and there were periods where it was the only other sound source in the set-ups of Jeff Mills and fellow minimalist Robert Hood, aside from a TR-909 drum machine. Core lessons learned while adapting to FM have been applied to other synthesizer (and synthesis) models later, shifting the focus of programming from the keyboard-derived approaches of 1970s art rock and fusion to the synthesizer’s modulation matrix.
FM taken as a metaphor hints at the problem that in a perception-based production scenario (such as techno’s paradigmatic mode of operation) all parameters are effectively bundled and thus exist in a state of permanent dynamic cross-modulation. Each layer is modulated by all others, mirroring the others, imprinting contours onto each other. Everything is melting into a convoluted perceptual entity where the products of cross-modulation form their own emergent aesthetic layer. No doubt the stark patterns have obvious rhythmic values, but all the time these also carry other information that wouldn’t exist without a carrier for it to be projected on. Although individual parameters (such as rhythm, pitch or compression ratio) can be described in isolation, such descriptions fail to identify the corresponding functions of their values and movements in context, i.e. their cross-modulated products are missing from the description. The exact same movement of one parameter may “mean” totally different things in different contexts. One might also recall that outside of scores and measurements, i.e. while listening, we actually never encounter parameters in isolation (no pitch without timbre, no rhythm without duration …): there is no such thing as an independent variable.
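The FM mechanism behind the metaphor is worth making concrete. This is the generic two-operator textbook form, not the DX100’s actual four-operator engine: a modulator sine wave is added to the phase of a carrier sine, so the carrier’s spectrum sprouts sidebands spaced by the modulator frequency, and a single “index” knob sweeps the timbre from pure tone to metallic clangor.

```python
import math

def fm_sample(t: float, f_carrier: float = 220.0,
              f_mod: float = 110.0, index: float = 2.0) -> float:
    """One sample of a two-operator FM voice at time t (seconds).
    `index` sets the modulation depth, and with it the brightness."""
    modulator = math.sin(2 * math.pi * f_mod * t)
    return math.sin(2 * math.pi * f_carrier * t + index * modulator)

# Render a 100 ms buffer at 44.1 kHz.
rate = 44100
buf = [fm_sample(n / rate) for n in range(rate // 10)]
assert all(-1.0 <= s <= 1.0 for s in buf)
```

Note how unlike subtractive synthesis there is no filter to reach for: timbre lives entirely in the modulation routing and depths, which is exactly the shift from keyboard-centric to matrix-centric programming described above.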
A new game based on Steve Reich’s iconic Clapping Music lets anyone improve their rhythm.
The app on iTunes.
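The piece behind the game is itself a tidy algorithm. The 12-beat pattern below is the one from Reich’s published score; the generator around it is my own sketch: one performer repeats the pattern unchanged while the other rotates it left by one beat each section, so after thirteen sections the two lines realign in unison.

```python
PATTERN = [1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0]  # 1 = clap, 0 = rest

def rotate(pattern: list, n: int) -> list:
    """Rotate a pattern left by n beats."""
    n %= len(pattern)
    return pattern[n:] + pattern[:n]

def sections(pattern=PATTERN):
    """Yield (fixed, shifted) pattern pairs, one per section of the piece."""
    for shift in range(len(pattern) + 1):
        yield pattern, rotate(pattern, shift)

def show(p: list) -> str:
    return "".join("x" if v else "." for v in p)

for i, (a, b) in enumerate(sections()):
    print(f"section {i:2d}: {show(a)}  |  {show(b)}")
```

The whole composition is that one loop; all the musical interest comes from the interference patterns between the two rotations.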
A good interview:
It’s Gonna Rain part II:
And this is just wonderful:
Wired with an oral history of ILM:
No one wanted Star Wars when George Lucas started shopping it to studios in the mid-1970s. It was the era of Taxi Driver and Network and Serpico; Hollywood was hot for authenticity and edgy drama, not popcorn space epics. But that was only part of the problem.
As the young director had conceived it, Star Wars was a film that literally couldn’t be made; the technology required to bring the movie’s universe to visual life simply didn’t exist. Eventually 20th Century Fox gave Lucas $25,000 to finish his screenplay—and then, after he garnered a Best Picture Oscar nomination for American Graffiti, green-lit the production of Adventures of Luke Starkiller, as Taken From the Journal of the Whills, Saga I: The Star Wars. However, the studio no longer had a special effects department, so Lucas was on his own.
Issie Lapowsky at Wired:
AltSchool is a decidedly Bay Area experiment with an educational philosophy known as student-centered learning. The approach, which many schools have adopted, holds that kids should pursue their own interests, at their own pace. To that, however, AltSchool mixes in loads of technology to manage the chaos, and tops it all off with a staff of forward-thinking teachers set free to custom-teach to each student. The result, they fervently say, is a superior educational experience.