Things tagged tech:
A cinematographer attempts to perform a digital vs. film comparison, in a very particular way. I think he fails to reach the goal he set for himself, but I don’t believe the theory he presents is incorrect. In a few years, digital imaging tech and the math to perform the transforms will have iterated far enough, and we will be there. Then this silly debate will be over :)
See also this revealing conversation he had with a film purist:
People are so religious about this that they’re resistant to even trying. But not trying does not prove it’s not possible.
If you believe there are attributes that haven’t been identified and/or properly modeled in my personal film emulation, then that means you believe those attributes exist. If they exist, they can be identified. If they don’t exist, then, well, they don’t exist and the premise is false.
It just doesn’t seem like a real option that these attributes exist but can never be identified and are effectively made out of intangible magic that can never be understood or studied.
To insist that film is pure magic and to deny the possibility of usefully modeling its properties would be like saying to Kepler in 1595 as he tried to study the motion of the planets: “Don’t waste your time, no one can ever understand the ways of God so don’t bother. You’ll never be able to make an accurately predictive mathematical model of the crazy motions of the planets — they just do whatever they do.”
Professional pilot Ron Rapp has written a fascinating article on a 2014 Gulfstream plane that crashed on takeoff. The accident was 100% human error and entirely preventable – the pilots ignored procedures and checklists and warning signs again and again. Rapp uses it as an example of what systems theorists call the “normalization of deviance.”
The point is that normalization of deviance is a gradual process that leads to a situation where unacceptable practices or standards become acceptable, and flagrant violations of procedure become normal – despite the fact that everyone involved knows better.
I think this is a useful term for IT security professionals. I have long said that the fundamental problems in computer security are not about technology; instead, they’re about using technology. We have lots of technical tools at our disposal, and if technology alone could secure networks we’d all be in great shape. But, of course, it can’t. Security is fundamentally a human problem, and there are people involved in security every step of the way.
I have seen this personally many times: you can be sloppy several hundred or thousand times without it biting you, so you come to believe that being sloppy carries no risk, and then, boom, a failure out of “nowhere.” When you look back and analyze the failure, you will find that this complacency about the new normal of sloppy behavior is the root cause.
The behavior of the researchers is reprehensible, but the real issue is that CERT Coordination Center (CERT/CC) has lost its credibility as an honest broker. The researchers discovered this vulnerability and submitted it to CERT. Neither the researchers nor CERT disclosed this vulnerability to the Tor Project. Instead, the researchers apparently used this vulnerability to deanonymize a large number of hidden service visitors and provide the information to the FBI.
Does anyone still trust CERT to behave in the Internet’s best interests?
Another take on content centric networking, see also Van Jacobson on CCN and Named Data Networking
IPFS is a peer-to-peer distributed file system that seeks to connect all computing devices with the same system of files. In some ways, IPFS is similar to the Web, but IPFS could be seen as a single BitTorrent swarm, exchanging objects within one Git repository. In other words, IPFS provides a high throughput content-addressed block storage model, with content-addressed hyperlinks. This forms a generalized Merkle DAG, a data structure upon which one can build versioned file systems, blockchains, and even a Permanent Web. IPFS combines a distributed hashtable, an incentivized block exchange, and a self-certifying namespace. IPFS has no single point of failure, and nodes do not need to trust each other.
At first glance this is an attempt to get CCN working in the absolutely simplest way, by patching together existing tech. I respect this, and it’s not a diss to say so. Having a working stack to play with will be valuable. However, I think the hackiness of this approach will prove too limiting to implement fully, and it will give way to a ground-up stack like the one the NDN folks are building, to get something that is performant and has features like mobile nodes built in.
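The core idea behind IPFS’s “single BitTorrent swarm, one Git repository” framing is content addressing: a block is named by the hash of its bytes, and links between blocks are just hashes, forming a Merkle DAG. A minimal sketch of that mechanism (illustrative only — real IPFS uses multihashes, CIDs, and a richer object format than this):

```python
import hashlib
import json

# Toy content-addressed block store: every block lives under the hash
# of its bytes, and "links" are simply hashes of other blocks.
store = {}

def put(data: bytes) -> str:
    cid = hashlib.sha256(data).hexdigest()
    store[cid] = data
    return cid

def put_node(links: list, payload: str) -> str:
    # A Merkle-DAG node: its identity depends on its own payload *and*
    # on the identities of everything it links to.
    return put(json.dumps({"links": links, "data": payload}).encode())

leaf_a = put(b"hello")
leaf_b = put(b"world")
root = put_node([leaf_a, leaf_b], "a directory, say")
```

Because changing any leaf changes every hash above it, a single root hash pins an entire immutable tree — which is why the same primitive supports versioned file systems, blockchains, and a “Permanent Web.”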
Matthew Brunwasser in the NYT:
“Every time I go to a new country, I buy a SIM card and activate the Internet and download the map to locate myself,” Osama Aljasem, a 32-year-old music teacher from Deir al-Zour in Syria, explained as he sat on a broken park bench in Belgrade, staring at his smartphone and plotting his next move into northern Europe.
“I would never have been able to arrive at my destination without my smartphone,” he added. “I get stressed out when the battery even starts to get low.”
Sarah Zhang in Wired:
Humans, quite simply, suck at walking in straight lines: Give a man a blindfold, and he will walk around in circles. In virtual reality, though, this failing becomes very convenient. People are so bad at knowing where they are in space that subtle visual cues can trick them into believing they’re exploring a huge area when they’ve never left the room—a process called redirected walking.
With VR technology getting ever better, people could one day explore immersive virtual spaces—like buildings or even whole cities—on foot in head-mounted displays. But it’s not very immersive if you end up smacking into a real-life wall. “This problem of how you move around when in VR is one of the big unsolved problems of the VR community,” says Evan Suma, a computer scientist at the University of Southern California.
That’s where redirected walking comes in. People don’t really notice, it turns out, if you increase or decrease the virtual distance they had to walk by 26 percent or 14 percent, respectively. Or shift the virtual room so they see their path as straight when their real path is curved. Or turn the room up to 49 percent more or 20 percent less than the rotation of their heads.
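The gain ranges quoted above translate directly into the core trick: scale the user’s real motion before applying it to the virtual camera, keeping the scale factor inside the band people can’t detect. A hypothetical sketch (function and constant names are mine, not from any VR toolkit):

```python
# Detection-threshold ranges from the article: virtual walking distance
# can be stretched up to 26% or compressed up to 14%; virtual rotation
# can be up to 49% more or 20% less than the real head turn.
TRANSLATION_GAIN = (0.86, 1.26)   # virtual distance = real distance * gain
ROTATION_GAIN = (0.80, 1.49)      # virtual turn = real turn * gain

def redirect(real_delta: float, gain: float, bounds: tuple) -> float:
    lo, hi = bounds
    gain = max(lo, min(hi, gain))  # clamp to the imperceptible range
    return real_delta * gain

# Walk 10 real meters while the system stretches space by the maximum 26%:
virtual_walked = redirect(10.0, 1.26, TRANSLATION_GAIN)
# Turn 90 real degrees while the scene rotates 49% extra:
virtual_turned = redirect(90.0, 1.49, ROTATION_GAIN)
```

Chaining these small per-step distortions is how a room-sized tracked space can masquerade as a building or a city.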
IRONSIDES is an authoritative/recursive DNS server pair that is provably invulnerable to many of the problems that plague other servers. It achieves this property through the use of formal methods in its design, in particular the language Ada and the SPARK formal methods tool set. Code validated in this way is provably exception-free, contains no data flow errors, and terminates only in the ways that its programmers explicitly say that it can. These are very desirable properties from a computer security perspective.
The Ada-Europe 2013 paper (pdf) is pretty readable.
Recently while preparing a lecture on the influence of gear on music I puzzled over the formal differences between Chicago’s house and Detroit’s techno. Both owed a lot to the restrictions inherent in Roland’s rigid TR-808 and TR-909 drum machines and the absence of a budget for much more of a set-up. There are so many commonalities that I wondered what the general differences really are. And I don’t seem to be the only one confused. Reportedly, Derrick May thought they were doing house music – until Juan Atkins insisted on the techno tag, which he in turn had borrowed from Alvin Toffler. Researching gear lists, I eventually stumbled upon a device named DX100. It was used by virtually every Detroit producer (including Derrick May: somebody said Nude Photo employs the “Wood Piano” preset) and there were periods where it was the only other sound source in the set-ups of Jeff Mills and fellow minimalist Robert Hood, aside from a TR-909 drum machine. Core lessons learned while adapting to FM were later applied to other synthesizer (and synthesis) models, shifting the focus of programming from the keyboard-derived approaches of 1970s art rock and fusion to the synthesizer’s modulation matrix.
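For readers who haven’t programmed an FM synth: the principle behind the DX100’s sound is one oscillator modulating the phase of another, which is why the modulation matrix, not the keyboard, is where the programming happens. A minimal two-operator sketch (the DX100 is actually a four-operator instrument and, like all Yamaha FM synths, technically uses phase modulation; the parameter names here are mine):

```python
import math

def fm_sample(t, carrier_hz=220.0, mod_ratio=2.0, index=1.5):
    # index (modulation depth) is the parameter that moves timbre:
    # index = 0 gives a pure sine, larger values add sidebands.
    mod = math.sin(2 * math.pi * carrier_hz * mod_ratio * t)
    return math.sin(2 * math.pi * carrier_hz * t + index * mod)

# Render a tenth of a second at 44.1 kHz:
sr = 44100
buf = [fm_sample(n / sr) for n in range(sr // 10)]
```

Sweeping `index` over time is what produces the metallic attacks and hollow basses associated with those records; no filter is involved.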
FM taken as a metaphor hints at the problem that in a perception-based production scenario (such as techno’s paradigmatic mode of operation) all parameters are effectively bundled and thus exist in a state of permanent dynamic cross-modulation. Each layer is modulated by all others, mirroring the others, imprinting contours onto each other. Everything is melting into a convoluted perceptual entity where the products of cross-modulation form their own emergent aesthetic layer. No doubt that the stark patterns have obvious rhythmic values, but all the time these also carry other information that wouldn’t exist without a carrier for it to be projected on. Although individual parameters (such as rhythm, pitch or compression ratio) can be described in isolation, such descriptions fail to identify the corresponding functions of their values and movements in context, i.e. their cross-modulated products are missing from the description. The exactly same movement of one parameter may “mean” totally different things in different contexts. One might also recall that outside of scores and measurements, i.e. while listening, we actually never encounter parameters in isolation (no pitch without timbre, no rhythm without duration …): there is no such thing as an independent variable.
A new game based on Steve Reich’s iconic Clapping Music lets anyone improve their rhythm.
The app on iTunes.
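The piece’s entire mechanism fits in a few lines: one 12-pulse pattern, with the second performer rotating it by one pulse per section until the two parts realign. A sketch, with the pattern transcribed from the published score (how the game scores your taps is my assumption, not documented here):

```python
PATTERN = [1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0]  # 1 = clap, 0 = rest

def shifted(pattern, n):
    # Rotate left by n pulses: performer two's part in section n.
    n %= len(pattern)
    return pattern[n:] + pattern[:n]

# Thirteen sections: shifts 0 through 12, ending back in unison.
sections = [shifted(PATTERN, n) for n in range(13)]
```

Performer one plays `PATTERN` throughout; the game presumably has you play the shifting part against it.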
A good interview:
It’s Gonna Rain part II:
And this is just wonderful:
Wired with an oral history of ILM:
No one wanted Star Wars when George Lucas started shopping it to studios in the mid-1970s. It was the era of Taxi Driver and Network and Serpico; Hollywood was hot for authenticity and edgy drama, not popcorn space epics. But that was only part of the problem.
As the young director had conceived it, Star Wars was a film that literally couldn’t be made; the technology required to bring the movie’s universe to visual life simply didn’t exist. Eventually 20th Century Fox gave Lucas $25,000 to finish his screenplay—and then, after he garnered a Best Picture Oscar nomination for American Graffiti, green-lit the production of Adventures of Luke Starkiller, as Taken From the Journal of the Whills, Saga I: The Star Wars. However, the studio no longer had a special effects department, so Lucas was on his own.
Issie Lapowsky at Wired:
AltSchool is a decidedly Bay Area experiment with an educational philosophy known as student-centered learning. The approach, which many schools have adopted, holds that kids should pursue their own interests, at their own pace. To that, however, AltSchool mixes in loads of technology to manage the chaos, and tops it all off with a staff of forward-thinking teachers set free to custom-teach to each student. The result, they fervently say, is a superior educational experience.
Tim Urban with a decent, if pop-sci, look at the singularity:
What does it feel like to stand here?
Nerd-out time. It gets really good towards the end of part 3.
The Named Data Networking (NDN) project makes use of the CCN (Content-Centric Networking) architecture developed at the Palo Alto Research Center (PARC). In this presentation, Van Jacobson speaks on content-centric networking at the Future Internet Summer School (FISS 09) in Bremen, Germany in June 2009.
A look back at Jurassic Park, the groundbreaking decision to create digital dinosaurs, and the impact it had on the future of movies.
Wisdom from Dan Kaminsky:
First off, there’s been a subtle shift in the risk calculus around security vulnerabilities. Before, we used to say: “A flaw has been discovered. Fix it, before it’s too late.” In the case of Heartbleed, the presumption is that it’s already too late, that all information that could be extracted, has been extracted, and that pretty much everyone needs to execute emergency remediation procedures.
About two days ago, I was poking around with OpenSSL to find a way to mitigate Heartbleed. I soon discovered that in its default config, OpenSSL ships with exploit mitigation countermeasures, and when I disabled the countermeasures, OpenSSL stopped working entirely. That sounds pretty bad, but at the time I was too frustrated to go on. Last night I returned to the scene of the crime.
I love @securityhulk. This ssl mess is making for some lols.
EASY TO RECOVER FROM SSL BUG. JUST REVOKE PRIVATE KEYS, AND ANY DATA SENT THAT EVER TRAVEL OVER SSL SINCE BUG INTRODUCED. EASY PEASY.
Audio nerds: 24-bit/192kHz distribution is better, right? Nope:
Monty at xiph.org:
Articles last month revealed that musician Neil Young and Apple’s Steve Jobs discussed offering digital music downloads of ‘uncompromised studio quality’. Much of the press and user commentary was particularly enthusiastic about the prospect of uncompressed 24 bit 192kHz downloads. 24⁄192 featured prominently in my own conversations with Mr. Young’s group several months ago.
Unfortunately, there is no point to distributing music in 24-bit/192kHz format. Its playback fidelity is slightly inferior to 16⁄44.1 or 16⁄48, and it takes up 6 times the space.
There are a few real problems with the audio quality and ‘experience’ of digitally distributed music today. 24⁄192 solves none of them. While everyone fixates on 24⁄192 as a magic bullet, we’re not going to see any actual improvement.
See this fantastic video for a walk-through of why stair-stepping is a total myth. (Yes, you should still record and produce at 24-bit, for the headroom so you don’t have to worry about clipping; but mastering a properly centered mix to 16-bit doesn’t lose anything.)
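Monty’s “6 times the space” figure is simple arithmetic, since uncompressed PCM bitrate is just sample rate × bit depth × channels:

```python
# Uncompressed stereo PCM bitrates for the formats under discussion.
def pcm_kbps(sample_rate, bit_depth, channels=2):
    return sample_rate * bit_depth * channels / 1000

hi_res = pcm_kbps(192_000, 24)   # 9216.0 kbps
cd = pcm_kbps(44_100, 16)        # 1411.2 kbps (16/44.1)
dat = pcm_kbps(48_000, 16)       # 1536.0 kbps (16/48)

ratio_vs_48 = hi_res / dat   # exactly 6x vs 16/48
ratio_vs_cd = hi_res / cd    # about 6.5x vs 16/44.1
```

So 24/192 costs six times the bits of 16/48 while, per Monty’s argument, delivering slightly worse playback fidelity.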