January 30, 2015

Building standalone Simulink models with calls to external libraries

Matlab Simulink can compile and build simulation models into standalone executables that do not require the presence of any runtime binaries from Matlab. However, there are a few limitations that I have had to overcome this week:

  1. There can be no algebraic loops in the model.
  2. Certain Matlab functions are not available.

Eliminating algebraic loops in the model is sometimes possible by reformulating the problem. This is a good idea in any case, as Simulink's algebraic loop solver is not guaranteed to converge, which can result in termination of the simulation.

Below is presented a way to overcome the second restriction for the functions that are used to call external libraries.

Matlab provides a few functions for calling functions in a dynamically loaded library file (.dll on Windows, .so on Linux):

  • loadlibrary(libname, hfile)
    Loads a DLL file into Matlab.
• libname is the name of the library, without the file name suffix (.dll/.so)
    • hfile is the name of a C-language header file that declares the functions provided by the DLL.
  • calllib(libname,funcname,arg1,...,argN)
    Calls a function provided by the DLL.
    • loadlibrary must be used before calling this function.
  • libpointer(datatype, value)
    Creates a pointer object that can be used for reading data that is written by a library function into a buffer provided as an argument.

These functions are very convenient, because they can automatically transform the Matlab data types into the correct forms expected by the library function. However, they are not available for use in compiled Simulink code. When executed within Matlab, the Simulink model is always compiled into an S-function. To be able to access loadlibrary, calllib and libpointer from such compiled code, they must be declared using coder.extrinsic().

However, when a Simulink model is compiled into a completely standalone executable, these functions are not available even with the coder.extrinsic() declaration. If the model contains a user-defined block, such as "MATLAB function", "Level-2 MATLAB S-Function" or "MATLAB System", which contains calls to library functions using the above-mentioned functions, building a standalone executable will fail with an error message: "The extrinsic function 'libpointer' is not available for standalone code generation. It must be eliminated for stand-alone code to be generated."

However, there is an alternative way of calling external library functions from compiled Matlab code:

  • coder.ceval('FCN', X1, ..., XN)
    This is a function that can only be used in MATLAB code that is compiled into C code. It is a kind of macro that simply places a call into the generated C code without making any checks on its arguments.
  • coder.cinclude(headerfile)
    Causes the addition of an #include statement in the generated C code.

Unfortunately the use of calllib and coder.ceval is mutually exclusive; calllib can only be used in non-compiled models and coder.ceval can only be used in compiled models. Fortunately, the compilation target can be checked at "run time" in the Matlab code through the value returned by coder.target. When code is compiled into an S-function within Matlab, this value is 'sfun'. When the code is compiled into a standalone model executable, the value is 'standalone'. This can be checked in the Matlab code using ordinary if statements, so it is possible to write custom blocks that work both in Matlab and as standalone executables.

  use_coder = ~strcmp(coder.target, 'sfun');

  if use_coder
     coder.ceval('myfunction', arg1, arg2, arg3);
  else
     calllib('library', 'myfunction', arg1, arg2, arg3);
  end

Unfortunately it is not quite as easy as this. To be able to use coder.ceval(), the executable must be linked against the DLL using a .lib file, forming a fixed dependency between the executable and the DLL(s). This unfortunately prevents calls to identically named functions in several different DLLs.

A bigger difference between coder.ceval() and calllib() is that coder.ceval() makes absolutely no checks or conversions for the data types of the arguments. All arguments must already be of a suitable data type, and any variables that are to be passed using a pointer (or a reference in C++) must be explicitly declared using coder.ref(). Otherwise the result will most probably be a hard crash of the standalone model.

Let's take as an example the following C language declaration for a function that returns the average of an array of floats and writes the standard deviation into a pointer argument.

  double stats(float *buf, double *sd_out, int bufsize);

While completely okay when used with calllib(), the following call has a number of problems.

  n = 10;
  values = rand(1, n);
  sd = 0.0;

  % This will crash!!
  mean = coder.ceval('stats', values, sd, n);

The vector 'values' would be passed into the function as a double pointer instead of a float pointer, as required. 'sd' is passed by value, not as a pointer. 'n' would be passed as a double instead of an int.
Finally, without a prior declaration of 'mean', Matlab Coder has no way to infer the data type of the return value, resulting in an error during the build.

Below is a corrected version of the call:

  n = int32(10);
  values = single(rand(1, n));
  sd = 0.0;
  mean = 0.0;
  mean = coder.ceval('stats', coder.rref(values), coder.ref(sd), n);

To catch more potential typing errors, a call to coder.cinclude('libraryheader.h') should also be added, so that a declaration of the library function is included in the generated C code.

It is quite natural for libraries to require calls to some kind of initialization functions, and a very natural place to locate these would be in an initialization function for the block that uses the library. Unfortunately, when building a standalone executable, these calls will be made before the Simulink Coder starts to build the executable, and the initialization calls will be completely omitted from the standalone model, resulting in possibly mysterious errors.

A possible solution is to move any such calls from the initialization functions into a conditional clause in the dependent code, which is only executed on the first call. An easy way to achieve this is to use a local persistent variable like this:

  function y = fcn(xin, par1, par2)
    persistent initialized;
    if isempty(initialized)
      initialized = true;
      coder.ceval('myinitializationcall', par1, par2);
    end

To gain access to necessary parameter values for the relocated initialization call, some additional parameters may need to be added to the code block. Below is an example of how to do this using the Model Explorer.

Let's say that we have a Matlab function block like this:

The block has a mask that defines a single integer input (Parameter1) that is used as an argument in an initialization call to a custom library:

The function itself is also implemented as a simple call to the same library:

  function y = fcn(u)
    coder.extrinsic('calllib');
    y = calllib('mylibrary', 'myfun', u);

The coder.extrinsic() call is necessary here, because the function is compiled into a MEX function by Simulink, even when the model is simulated in Matlab. The initialization call in the mask, on the other hand, is made by the Matlab interpreter, in which calllib() is always available.

Now if we want to be able to use the block in a standalone Simulink model executable, we need to replace the initialization calls in the mask with calls that are made in the function itself. For this purpose, we can add an additional argument to the function, which provides the value of the mask parameter. To do this, we open the Model Explorer and select the MATLAB Function block (Right-click/Explore will do). With the "Add Data" command, we can insert the necessary parameter into the function. Note that the scope of the added data item must be set to "Parameter", instead of the default scope of "Input".

Then we can add the equivalent call to coder.ceval() into the function and also add a conditional code block for the initialization call, like this:

  function y = fcn(u, Parameter1)
    persistent init_called;

    if strcmp(coder.target, 'sfun')
      coder.extrinsic('calllib');
      y = calllib('mylibrary', 'myfun', u);
    else
      if isempty(init_called)
        init_called = true;
        param = int32(Parameter1);
        coder.ceval('myinit', param);
      end
      % make sure of the right types
      y = 0.0;
      tmp = double(u);
      y = coder.ceval('myfun', tmp);
    end

To be able to build the model, we also need to add 'mylibrary.lib' as an external library in the code generation configuration for the whole model (Code Generation/Custom Code/Libraries). After doing this, we are ready to build the model as a blazingly fast and completely standalone executable with no additional runtime library dependencies.

July 13, 2012

Drone Warfare, Blowback and PTSD

In The Real Blowback Fallacy at antiwar.com, John Poindexter has good counterarguments to Christopher Swift in foreignaffairs.com.
It means that if five members of his group agreed that drone strikes aid in recruiting AQAP members, then roughly 3,000,000 other Yemenis must also support that conclusion.
He is spot on contrasting the local actions of AQAP with those of the US.
It certainly shouldn’t be the role of the U.S. to police Yemen, but if the money is going to be spent, I would rather see it go to feeding some poor family or digging wells in the desert than burning the flesh off infants or dismembering whole families at random.
Despite all the talk about precision strikes, the fact is that since Vietnam, force protection has been allowed to completely dominate over the need to avoid collateral damage. Even "precision" use of air force will always cause much more collateral damage than would be caused by ground forces.

As it happens, drone warfare can be even more damaging to the psyche of the operator than participation in ordinary warfare. Going to work to kill people during the day, perhaps just kilometers away from one's home, and returning home every evening to spend time with the family, makes the killing a normal part of everyday life, causing a far deeper impact on one's conscience than acts that are made in the special circumstances of ordinary warfare. P.W. Singer explains it well in this video.

This problem has been well recognized by the people involved, but the solution proposed by David Axe a few weeks ago is definitely not the right solution.
A more independent drone could alert its controller for assistance only when it has spotted a likely target. The operator would give a thumbs-up or thumbs-down for the robot to fire a weapon. With only minimal involvement, the human being could avoid feeling fully responsible for the consequences of the strike. Drones are already becoming more autonomous by the day, opening the door for a different emotional dynamic between them and their operators.
Besides being flagrantly immoral, this would just heavily increase the number of collateral casualties and cause even bigger blowback.

April 26, 2010

About My Master's Thesis

This post might well be the first that has any connection at all to my everyday life.

I haven't written anything here for a very long time. I have been busy enough with other things. In my spare time I have been finishing my long overdue master's thesis at the Helsinki University of Technology, which is nowadays a part of the Aalto University in Helsinki, Finland.

My thesis concerns digital texturing of solid objects, and in it I describe how texture mapping, as it is understood in computer graphics, can be used for designing objects with custom low-level surface details. The work is based on an old project in which we produced highly accurate laser-machined details into the surfaces of plastic injection molds, based on ordinary bump map images and 3D models of the mold cavities. The point of my thesis is that digital texture mapping is a viable tool for the design and manufacturing of (embossed) surface details, as long as the right tools are made available.

As one part of the work I developed a new method for adaptive displacement mapping of triangular meshes that is specifically aimed at well-specified tolerances at an optimal number of output primitives. It can be readily used to produce textured geometries for rapid prototyping or laser machining.

Its applicability is not necessarily limited to manufacturing; it could be used for visualization as well. At least it easily beats the adaptive displacement in 3dsMax.

Here is a flat shaded rendering of a sample model from the algorithm, along with the tileable displacement texture. (It's an old concept design, not any actual phone model.)

If anyone is interested in asking more, please feel free to email me at vtt.fi (firstname.lastname).

November 29, 2009

Speculation vs. Actual Investments - The Carrot and the Stick

Paul Krugman takes a stand in support of taxing financial transactions. He mentions activities in the financial sector that Adair Turner referred to as "socially useless", meaning especially speculation. I'm all for any means of reducing excessive speculation, if a workable solution can be found.

On the other hand, there could be more talk about encouraging more socially useful activities. Investment must be directed away from speculative secondary markets and into primary markets that create new real assets.

There was a failed attempt at getting tax relief for "angel investors" in Finland. The idea was quite sensibly dropped, because an "angel investor" is not a concept that anybody is able to define in a watertight manner in a legal sense.

So I thought: why not give a tax relief to all capital gains income from first resales of stock that was bought in an issue? Thus, the original holder of any newly issued equity would have a favorable treatment. This would both create an incentive for participating in stock issues and a disincentive for giving up the original ownership.

This would encourage people to invest in newly started or growing businesses. It would also encourage companies to acquire new capital, thus acting as a disincentive for excessive leverage.

As a bonus, there would be no need to define any ambiguous terms like "angel investor".

October 14, 2009

John Hussman - Zen and the Art of Market Analysis

Fund manager John Hussman makes some interesting, if a bit gimmicky, connections between the zen Buddhist teachings of Thich Nhat Hanh and financial decision making in his latest newsletter:
The best way of preparing for the future is to take good care of the present, because we know that if the present is made up of the past, then the future will be made up of the present. All we need to be responsible for is the present moment. Only the present is within our reach. To care for the present is to care for the future.

Thich Nhat Hanh

This week's comment is dedicated to my dear friend Thich Nhat Hanh, a Vietnamese Buddhist monk who was born on October 11, 1926, having been born previously in January of that same year, and twice again about 25 years earlier, not to mention countless other times through his ancestors, teachers, and other non-Thich Nhat Hanh elements. Thay (the Vietnamese word for “teacher”) would simplify this by saying that today is his eighty-third “continuation day,” because to say it is his birthday is not very accurate.


So here's another koan – “If a share of stock is sold in a forest, and nobody is around to buy it, does it still generate a fill?”

The immediate implication of interbeing is that we are forced to think about “general equilibrium” rather than imagining that one side of a trade can exist without the other. This immediately clarifies all sorts of misconceptions that we could fall victim to if we aren't careful.

For example, it immediately tells us that “cash on the sidelines” is not a useful concept, except as a measure of issuance. See, whatever “cash” is there on the sidelines exists because government has created paper money, or the Treasury has issued bills, or because companies have issued commercial paper. Until those securities are actually physically retired, they will and must remain “on the sidelines” because somebody will have to hold them.

If Mickey wants to sell his money market fund to buy stocks, the money market fund has to sell commercial paper to Nicky, whose cash goes to Mickey, who uses it to buy stocks from Ricky. In the end, the commercial paper Mickey used to have is now held by Nicky. The cash that Nicky used to have is now held by Ricky, and the stock that Ricky used to have is now held by Mickey. There is exactly the same amount of “cash on the sidelines” after this transaction as there was before it.


Here's another koan:

A novice monk approaches his teacher and asks “What is the price movement of one share being bought?”

The teacher holds out a cypress leaf in his palm and asks, “Did I catch the leaf as it fell from the tree, or did I raise it from the ground?”

We are used to thinking that the act of buying necessarily implies rising prices. But think about this for a second. In either case, the teacher gets the cypress leaf. What makes the difference so far as direction is concerned is where the pressure is coming from. If the cypress leaf is being offered down by gravity, it is caught on a decline. If the leaf is being lifted by the teacher, it is caught on an advance. Remember that. It is easy to get trapped in wrong thinking by people who talk about “cash on the sidelines” or talk about “investors” buying or selling in aggregate.

There was no excess of stock that was “sold” in March that has to be “bought” back now. Investors didn't “get out” of the market last year, and we shouldn't think that they have to “come into” the market now. Every share that was sold was bought. That has been true for every minute of every trading day since the beginning of the financial markets.
These insights, though perfectly understandable without esoteric zen Buddhist thinking, are spot on. There is no such thing as "cash on the sidelines".

Instead, one could try to evaluate the amount of potentially untapped credit. Reductions in credit lines all around have been taking this measure down lately. Both businesses and households have been shrinking their debt loads in the US somewhat. Unfortunately unemployment, lack of demand and shrinking collateral are eating the other side of the equation.

Thich Nhat Hanh writes well about zen in terms that are easily understandable by the average layperson. I liked his book "Peace is Every Step". He has lots of interesting advice for practicing zen in everyday situations, like driving a car or washing dishes. Unfortunately I have great problems in adhering to the principle of mindfulness. Absent-mindedness is my second nature. (Or is it the first?)

Financial Analyst Bias and Client Pressure

Bloomberg has published an interesting article by Edward Robinson about the influence that sales representatives of investment banks have had on the financial analysts working at the companies (hat tip to C K Michaelson of Some Assembly Required):
When Credit Suisse Group analyst Ivy Zelman refused to turn bullish on homebuilding stocks during a rally in the fourth quarter of 2006, the blowback was intense.

She says investors told her that some housing industry executives were ridiculing her analysis as a “jihad,” and several of the bank’s sales representatives pressed her to upgrade “hold” ratings to “buys” on companies to appease bullish institutional-investor clients. One sales manager even sent her an e-mail warning that analysts who stayed bearish too long often lost their jobs.
The article goes on to explain how very biased analysts in general are against handing out "sell" recommendations for stocks.

The pressure that is applied on analysts who make "sell" recommendations has been documented before. Star analyst Meredith Whitney even got death threats for giving out a pessimistic but accurate analysis of Citigroup in late 2007.

The reference to "appeasing institutional-investor clients" is a dead giveaway of the cause of this pressure. These clients are not after sound advice. They have already made up their minds. What they are after is a blanket of security in the form of advice that just happens to agree with their prior decisions.

If things go wrong, these institutional investors (read retirement fund managers) can just explain that they relied on the investment bank analyst's analysis. This partly relieves them from the responsibility for the bad investments.

Maybe it would not be a bad idea to prevent companies that handle client money or have proprietary trading desks from publishing any analysis or advice at all. At least fund managers shouldn't be able to hide behind advice from parties that are completely dependent on them as clients.

October 5, 2009

Bubbles, Unemployment and Fiscal Stimulus

Paul Krugman is trying to fight for the usefulness of fiscal stimulus in mitigating unemployment.
Ryan Avent has some fairly harsh words for Arnold Kling’s recalculation theory of business cycles. Tyler Cowen, predictably, thinks Ryan is too snide.

But what none of the participants in the debate seem to realize is that Arnold is basically reinventing 1934 macroeconomics.


It’s all there: mass unemployment is necessary, because you have to shift resources away from sectors that got too big, stimulus is a bad thing because it slows the necessary adjustment. And now as then, the whole notion falls apart when you ask why, say, a housing boom — which requires shifting resources into housing — doesn’t produce the same kind of unemployment as a housing bust that shifts resources out of housing.
Both booms and busts involve shifting of resources between sectors. This much is self-evident. It would be stupid to try to deny that. This doesn't mean that the unemployment should not be alleviated at all.

Krugman unfortunately makes a complete fool of himself in the last paragraph. Of course booms don't involve unemployment. In a boom, the driving force of inter-sectoral adjustment is a surge in demand. In a bust, the cause is a collapse of demand.

In a boom, the workforce moves behind the pull of opportunity. This was clearly visible in the flow of newly minted real estate brokers who were encouraged to leave their prior jobs in search of bubble-induced income. In a bust, the realignment is caused by a push-type phenomenon. (Or is it a kick?)

This is such an elementary thing that I can only wonder what Mr. Krugman has been smoking while writing that last paragraph.

To clarify, my own view is that increasing unemployment can hardly be completely avoided in a major bust of a bubble, but that government projects can (and should) be brought forward to mitigate it, if only to make use of the suddenly cheap labor and raw materials. Busts clearly have effects on people who were not directly involved in the bubble and an uncontrolled collapse is bad for everybody.

Equally, I believe that fiscal measures might even be appropriate in controlling the growth of a bubble. Additional small taxes on house-flippers (say for houses bought and sold within 12 months) could have helped slow down the development of the housing bubble. Additionally, it could have provided a larger cushion for the inevitable fiscal collapse.

At both ends of the boom-bust cycle, monetary policy should be the primary means ahead of fiscal measures. In this sense, the QE policies should go even further, as there is a lot of credit that is (and should be) paid down.

This collapsing credit can be replaced with base money even if the money multiplier has collapsed to unity. Instead of "mopping up the liquidity", central banks should bring back meaningful cash reserve requirements (for all bank-like entities) after the economy has recovered from the excess of credit. This will naturally be fought tooth and nail by the financial sector.