Saturday, April 27, 2013

Software Estimation

My entry on Probabilistic Programming got me wondering about estimating productivity, and sent me digging out J. P. Lewis' classic paper Mathematical Limits to Software Estimation.

Abstract: Algorithmic (KCS) complexity results can be interpreted as indicating some limits to software estimation. While these limits are abstract they nevertheless contradict enthusiastic claims occasionally made by commercial software estimation advocates. Specifically, if it is accepted that algorithmic complexity is an appropriate definition of the complexity of a programming project, then claims of purely objective estimation of project complexity, development time, and programmer productivity are necessarily incorrect.
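For readers who have not read the paper, the core technical ingredient (stated here in my own notation, so treat it as a sketch rather than Lewis' exact formulation) is the Kolmogorov-Chaitin-Solomonoff complexity of a string: the length of the shortest program that produces it on a fixed universal machine, a quantity that is provably not computable.

```latex
% Sketch only; notation is mine, not taken from Lewis' paper.
% K_U(x) is the length of the shortest program p that makes a fixed
% universal machine U output the string x:
\[
  K_U(x) \;=\; \min \{\, |p| \;:\; U(p) = x \,\}
\]
% The classical result is that K_U is uncomputable: no algorithm can,
% in general, return K_U(x) for an arbitrary x. If a project's
% "complexity" is identified with the K of its specification or
% finished program, then a purely objective, algorithmic estimate of
% that complexity is ruled out in the general case.
```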

Mr. Lewis wrote a lengthy supplement elaborating on the many misunderstandings of what the paper actually said. Alas, it did not really help me figure out how to estimate the time it might take to use probabilistic methods. At this point, that is probably not even possible, given the prenatal state of the probabilistic methodology.


Probabilistic Programming

The Defense Advanced Research Projects Agency's (DARPA) budget proposals are always an interesting read. They lead to things like this potentially Open Source project: Probabilistic Programming for Advancing Machine Learning (PPAML), Solicitation Number DARPA-BAA-13-31:

Machine learning is at the heart of modern approaches to artificial intelligence. The field posits that teaching computers how to learn can be significantly more effective than programming them explicitly. This idea has revolutionized what computers can do in a wide range of domains, including Intelligence, Surveillance, and Reconnaissance (ISR), Natural Language Processing (NLP), Predictive Analytics, Cyber, and various scientific disciplines. Example applications include self-driving cars, image search and activity detection, object tracking, topic models, spam filters, recommender systems, predictive databases, and gene sequencing. Unfortunately, building effective machine learning applications currently requires Herculean efforts on the part of highly trained experts in machine learning. Probabilistic Programming is a new programming paradigm for managing uncertain information. The goal of the Probabilistic Programming for Advancing Machine Learning (PPAML) program is to facilitate the construction of machine learning applications by using probabilistic programming to: (1) dramatically increase the number of people who can successfully build machine learning applications; (2) make machine learning experts radically more effective; and (3) enable new applications that are inconceivable today.

Machine learning applications work by building a model of a phenomenon of interest and then training or conditioning that model with observed data, similar to the way we may 'teach' a Neural Network today. Probabilistic Programming is not itself a classic Inference Engine (IE); rather, it is a front end to classic IEs, or to IEs not yet thought of. BUGS would be an example of a classic IE. The Probabilistic Programming approach separates the model from the solver, making it possible for one set of users to develop the model without having to implement or know the details of the solver. With Probabilistic Programming languages, once they exist, developers will be able to focus on developing their models, while solver experts will be able to embed their expertise in reusable, new-style inference engines.
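To make the model/solver split concrete, here is a minimal, hedged sketch in plain Python. It is not any particular PPAML tool or language, and the data, prior, and tuning numbers are made up for the demonstration; the point is only that the modeler writes a log-density function and the generic Metropolis-Hastings "solver" never looks inside it.

```python
# Minimal sketch of "model separate from solver"; everything here is a
# made-up toy, not an actual probabilistic programming system.
import math
import random

def log_posterior(theta, data):
    """Toy model: unknown mean 'theta' with a Normal(0, 10) prior and a
    Normal(theta, 1) likelihood for each observation."""
    log_prior = -theta * theta / (2.0 * 10.0 ** 2)
    log_like = sum(-(x - theta) ** 2 / 2.0 for x in data)
    return log_prior + log_like

def metropolis(log_density, start, steps=5000, step_size=0.5):
    """Generic solver: works for any 1-D log_density the modeler supplies."""
    current = start
    current_lp = log_density(current)
    samples = []
    for _ in range(steps):
        proposal = current + random.gauss(0.0, step_size)
        proposal_lp = log_density(proposal)
        # Accept with probability min(1, exp(proposal_lp - current_lp)).
        if random.random() < math.exp(min(0.0, proposal_lp - current_lp)):
            current, current_lp = proposal, proposal_lp
        samples.append(current)
    return samples

if __name__ == "__main__":
    data = [4.9, 5.2, 4.7, 5.1, 5.0]            # pretend sensor readings
    samples = metropolis(lambda t: log_posterior(t, data), start=0.0)
    burn = samples[1000:]                        # discard warm-up samples
    print("posterior mean ~", sum(burn) / len(burn))
```

Swapping in a smarter solver (Hamiltonian Monte Carlo, variational inference, an exact engine of the kind BUGS uses) would not require touching the model function at all, which is the whole point of the paradigm.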

Where I see Probabilistic Programming being of the most use in Embedded Space is in the area of network routing for the Internet of Things (leaving aside, for the moment, the issues of the lack of conventional radio spectrum and the rising Internet noise floor slowing everything down).

There are some Software Safety issues to address, such as how a Probabilistic Program is verified and validated. No human may actually know how it works well enough to explain how a particular conclusion was reached. MISRA, well known for their C and C++ safety Guidelines, does have an obscure section on autocode generation. However, that is still based on the conventional technology of today. What happens when true Artificial Intelligence systems start creating programs that we rely on?

To keep up to date follow the Probabilistic-Programming.org Mailing List.


Monday, April 1, 2013

Weight of the Soul or Dust on the Scale?

Back around Halloween I mentioned my visit to the J. B. Rhine Research Center (Rhine was the first to do experiments with 'ESP' at Duke University in Durham, North Carolina, in the 1920s); see Near Death at the Rhine Research Center. The first formal report of the optical work, Electromagnetic Emission From Humans During Focused Intent by William T. Joines, Stephen B. Baumann (deceased), and John G. Kruth, Journal of Parapsychology 76(2), is now online (membership is required). The formal report does cover Mitogenetic Radiation, something I thought was lacking previously.

What also fascinates me is what they want to do if they can get the funding for a future research project: they want to try to weigh people who claim they can leave their body, sometimes referred to as Astral Travel or Astral Projection (the original experiments involved death; no one wants to be killing the research subjects today), and see if there is a change in their weight.

If you believe in such things, that is your personal choice. What is fascinating to me, after looking into this a bit, is just how hard it is to make a precise and accurate weight scale. This scale road has been traveled in the past:

[Jim Williams] worked for a few years with the Nutrition and Food Sciences Dept at MIT, building equipment for them. Once he built a scale that was so sensitive he could stand on it, take a bite out of a donut and measure the weight of the bite. He had to add a low frequency notch filter to take out the heartbeat of the user as the blood flowed up and down the femur arteries, modulating the weight on the scale. - I Remember Jim Williams, a Guru of the Analog Electronics World, as Much an Artist as Engineer.
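Williams did his heartbeat rejection in the analog domain; just to get a feel for the idea, here is a hedged digital equivalent using SciPy's iirnotch filter. The 50 Hz sample rate, ~72 bpm (1.2 Hz) heart rate, and signal amplitudes below are made-up numbers for the demonstration, not anything from his design.

```python
# Hedged illustration of the heartbeat-notch idea from the Jim Williams
# story, done digitally rather than with his analog circuit.
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 50.0                                        # assumed scale ADC sample rate, Hz
t = np.arange(0, 30, 1 / fs)                     # 30 seconds of readings
true_weight = 250.0                              # pounds
heartbeat = 0.02 * np.sin(2 * np.pi * 1.2 * t)   # ~72 bpm blood-flow modulation
noise = 0.005 * np.random.randn(t.size)          # other small disturbances
reading = true_weight + heartbeat + noise

b, a = iirnotch(w0=1.2, Q=5.0, fs=fs)            # notch centred on the heart rate
cleaned = filtfilt(b, a, reading)                # zero-phase filtering

print("raw spread      :", reading.max() - reading.min())
print("filtered spread :", cleaned.max() - cleaned.min())
```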

Scott Wayne, Analog Dialogue's editor, pointed me to Jim Williams' scale paper (thank you, Scott): High resolution scale: Measures up to 250 lb +/- .01 lb, detects a single bite of food, Analog Dialogue, vol. 10, no. 2, p. 17, 1976. Every issue of Analog Dialogue, from Volume 1, Number 1, 1967 through the present, is available in Analog Devices' Analog Dialogue archives.

Getting back to the present day, Vishay Precision Group (VPG) has released a video, described here, on their Z-Foil Resistors.

In the video, a custom designed scale is used to weigh 200 grams of gold at +25C and +60C, first utilizing thin film technology for the gain resistors and then using the Z-foil resistors.

At +25C, both the thin film and Z-foil resistors measure the 200 grams of gold for a value of $11,400. With an ambient temperature increase to +60C, however, the assembly utilizing precision thin film resistors weighs the gold as 197.8 grams for a value of $11,277 - which represents a 'loss' of $123. The same assembly using Z-foil resistors is not affected by the temperature change, providing an accurate measurement of 200 grams at +60C - and no loss in measured value.
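Just to sanity-check the video's numbers, here is my own arithmetic; only the $11,400, 200 g, 197.8 g, and temperature figures come from the quote above, and the implied drift is a back-calculation for the whole gain network, not a claim about any particular resistor's tempco.

```python
# Back-of-the-envelope check of the numbers quoted from the Vishay video.
true_grams = 200.0
thin_film_grams = 197.8                        # reading at +60 C, thin film gain resistors
price_per_gram = 11_400 / true_grams           # ~ $57/g, from the video's $11,400

delta_t = 60.0 - 25.0                          # temperature rise, degrees C
gain_error = thin_film_grams / true_grams - 1  # fractional gain drift
drift_ppm_per_c = gain_error / delta_t * 1e6   # implied drift of the gain network

loss = (true_grams - thin_film_grams) * price_per_gram

print(f"implied gain drift : {drift_ppm_per_c:.0f} ppm/C over {delta_t:.0f} C")
print(f"apparent 'loss'    : ${loss:.0f}")     # ~ $125 vs the video's rounded $123
```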

See also Why Honest Weigh Scales Are Application Specific and Application Note 5275: Calibration - Needless or a Necessity?, from Maxim Integrated.

In 1976 we could measure a ±0.16 ounce (0.01 lb) change in a 250 pound person. Thirty-seven years on, I wonder if Analog Devices, Intersil (see the video), Linear Technology (Jim Williams), Maxim Integrated, or Vishay Precision Group (VPG) would step up to the challenge of funding the Rhine's project, to showcase how their current technologies have improved on that. Maybe you think you are up to the challenge of this scale design? Let us know.
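For perspective, and purely my own arithmetic, that 1976 spec works out to about 25,000 counts of full scale, or roughly 14.6 bits of noise-free resolution:

```python
# What the 1976 spec works out to: 0.01 lb resolution on a 250 lb full scale.
import math

full_scale_lb = 250.0
resolution_lb = 0.01

counts = full_scale_lb / resolution_lb       # distinguishable steps
bits = math.log2(counts)                     # equivalent ADC resolution
ppm = resolution_lb / full_scale_lb * 1e6    # resolution as parts per million

print(f"{counts:.0f} counts ~ {bits:.1f} bits, or {ppm:.0f} ppm of full scale")
```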


'Magic Smoke' Resistors available off-the-shelf

Anyone who has been around electronic devices for any length of time knows that when the devices fail, they tend to go up in smoke, leading to the idea that electronic parts are run by Magic Smoke. Alas, our sterilized, sanitized, paranoid society is taking the *fun* out of such things as hands-on learning.

While places like Analog Devices' Engineering University are great for learning theory and doing hands-on labs with their hardware, sometimes it is far more educational, and downright *fun*, to learn why things went wrong. Only experience is going to teach the important debugging skill of telling apart the smells of burning resistors, burning capacitors, and burning circuit board material. It also teaches the importance of wearing protective eyewear (you never know when a backwards part might try to impale itself into the ceiling, or one's face if it is closer) and of having an electrical-rated fire extinguisher next to the workbench.


Which brings us to Magic Smoke: Vishay has upgraded their old line of Electro-Pyrotechnic Initiator Chip Resistors (EPIC) (along with the Design Guide and App Notes) to the new Massive Electro-Pyrotechnic Initiator Chip Resistor (MEPIC):
MEPIC resistors, also known as bridge resistors, are resistive elements that convert electrical energy into heat energy in a precise electro-thermal profile for the purpose of initiating a series of pyrotechnic events in a controlled energetic reaction. [They go *BOOM* on command, which is different than Rapid Spontaneous Self-Disassembly.]
The new Vishay Sfernice resistor is optimized for electronic igniter applications in automotive safety systems for the deployment of airbags and other safety devices; digital blasting in mining applications; and in fireworks applications for better synchronization of fireworks, music, and special effects.
With firing energy down to 1.5 mJ and a typical ohmic range of 2 Ohms (+/- 10 %), the device provides designers with very predictable, reproducible, and reliable behavior.
Offered in the standard 0805 case size for the wraparound and flip chip versions, with other sizes available upon request, the resistor features easy set-up of firing levels, and is compatible with various pyrotechnic compositions.
Offering ESD withstanding to 25 kV without extra protection, the MEPIC resistor's performance meets no fire/all fire conditions and the requirements of USCAR, AKLV16, and major car manufacturer standards. The device is RoHS-compliant and conforms to Vishay "Green" standards. [Is it not great that Fuzes are 'Green'?]
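Out of curiosity, here is a rough feel for what the quoted 1.5 mJ firing energy into a nominal 2 Ohm bridge implies; the 0.5 ms pulse width is my assumption for illustration, not anything from Vishay's data sheet.

```python
# Rough feel for the quoted MEPIC numbers: 1.5 mJ into a 2 ohm bridge.
import math

energy_j = 1.5e-3          # firing energy from the press release
resistance = 2.0           # ohms, nominal bridge resistance
pulse_s = 0.5e-3           # assumed firing pulse width (my guess, for illustration)

power_w = energy_j / pulse_s                 # average power during the pulse
current_a = math.sqrt(power_w / resistance)  # from P = I^2 * R
voltage_v = current_a * resistance

print(f"{power_w:.1f} W for {pulse_s*1e3:.1f} ms "
      f"~ {current_a:.2f} A at {voltage_v:.2f} V across the bridge")
```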
Almost lost in the mists of time is that, in the past, manufacturers made military-specific parts, before someone thought that Commercial Off The Shelf technology (COTS) was a good idea (it wasn't). National Semiconductor's (now part of TI) Application Note #761: Electronic Fuzing covers the basic terminology of fuzing:
Fuzing mechanisms are devices used to "safe", "arm" and detonate explosive military munitions (such as missiles, mines, demolition charges, explosive shells ranging in size from 20 mm to 16 inches, unguided bombs and various submunitions). Early electronic fuzes developed for 5-inch naval air defense...
Sadly, even though I'm in good standing with my Vishay Rep. Firm, samples of (M)EPIC parts are restricted to people who can show good cause for getting them, so Homeland Security can relax. Too bad; think of the fun the people at Hack A Day or Make Magazine could have with some of these...as well as those great educational experiences that are being lost...

Picture of Aliens and UFO's found on military web site.
