Improving Wetware

Because technology is never the issue

Thank You for Erlang Joe

Posted by Pete McBreen 19 May 2019 at 04:00

Recently I have been looking at Erlang and Elixir, and in the process was reading Coders at Work when I came across this quote from Joe Armstrong (p. 213):

I think the lack of reusability comes in object-oriented languages, not in functional languages. Because the problem with object-oriented languages is they’ve got all this implicit environment that they carry around with them. You wanted a banana but what you got was a gorilla holding the banana and the entire jungle.

If you have referentially transparent code, if you have pure functions – all the data comes in its input arguments and everything goes out and leaves no data behind – it’s incredibly reusable. You can just reuse it here, there and everywhere…
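
To make the point concrete, here is a minimal sketch in Elixir of what Armstrong is describing (the module and function names are my own, purely illustrative): a pure function gets everything it needs through its arguments and leaves nothing behind, so it can be lifted into any codebase unchanged.

    # Pricing is a hypothetical module; add_tax/2 is pure because its
    # result depends only on its arguments and it reads or modifies
    # no outside state.
    defmodule Pricing do
      def add_tax(amount, rate) do
        amount + amount * rate
      end
    end

    # No implicit environment comes along for the ride, so the call
    # works the same wherever it is dropped in:
    # Pricing.add_tax(100, 0.05)  #=> 105.0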

Something to think about.

Blockchain - is it a good idea for some domains?

Posted by Pete McBreen 15 May 2019 at 06:00

Found this set of articles on Twitter…

From the history of a bad idea…

When an audience member, tiring of this foggy talk, asked if there was anything concrete that blockchains could offer the NHS, they responded that asking for practical uses of Blockchain was “like trying to predict Facebook in 1993.” The main takeaway for the health care sector people I was with was swearing never to use said accounting firm for anything whatsoever that wasn’t accounting.

Rethinking Driverless Vehicles

Posted by Pete McBreen 10 May 2019 at 13:53

An article in Nature suggests that researchers have made a wrong turn in thinking and writing about driverless vehicles.

What these academics are not doing is asking the questions that society needs answered to decide what the role of driverless cars will be.

Ashley Nunes suggests:

This leads to something many academics overlook: driverless does not mean humanless. My research on the history of technology suggests that such advances might reduce the need for human labour, but it seldom, if ever, eliminates that need entirely. Regulators in the United States and elsewhere have never signed off on the use of algorithms crucial to safety without there being some accompanying human oversight. Rather than rehashing decisions from Philosophy 101, more academics should educate themselves on the history of the technology and the regulatory realities that surround its use.