SW Engineering

We're a technology-based and solution-oriented company. Everything we do is about solving problems and finding the best technologies for them. Unlike other companies, we're not a single-technology shop. Using only one library, platform or language is like a carpenter using only a hammer and planks. These are good tools, and often things work fine. But at other times you may need a saw or a drill, and the foundation may be better suited to concrete or stone.

It is our passion to create, and to do so with excellence, and our investment in technologies can help you navigate. The SW field is littered with dysfunctional cul-de-sacs: technologies that solve one problem but cause many others down the line.

As a result we have an exceptionally long list of areas of expertise, and these are not textbook or lab insights: to us, experience means having shipped a commercially viable product. Through the years we have also built a small network of extraordinary developers that we can reach out to. Knowing whom to ask, and what to look out for, is really important.

What makes us a bit different is our take on development. We want to ensure that our customers' products are profitable, and can remain profitable. We believe that success comes from creating SW that is simple to produce, simple to maintain, and supportive of business development. After all, that is what agile is really about: being able to deliver, year after year, the new innovations that keep you ahead of the competition.

SW weaving

This is a cutting-edge SW development methodology. We are pioneers in turning the academic ideas of model/code weaving and domain-specific modeling into a usable methodology, applying experiences from flexible production automation to SW development. This is not the traditional, and invariably dysfunctional, code generation from UML.

We have a pattern-based methodology that enables architecture-level reuse. An architecture can't really be reused unless you build an identical house. However, you soon discover that solutions are reused, and that there are rules and restrictions governing them. The technique is often called domain-specific modeling: create the building blocks and the rules needed for that type of application.

DSMs are really powerful and useful when you repeat tasks or when your solutions are self-conformant. Looking at a successful DSM tool such as MetaCase's MetaEdit+, you'll find that it is used to create "menu systems" and the like, which really are recursive, self-conformant structures.
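To make self-conformance concrete, here is a minimal sketch in C++ (our illustration, not MetaEdit+ output): a menu entry is either a leaf command or another menu, so one small concept and one rule describe menus of any depth.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Self-conformant structure: an entry is either a leaf command or another
// menu, so the same concept describes menus of any depth. (Requires C++17
// for std::vector with an incomplete element type.)
struct MenuEntry {
    std::string label;
    std::vector<MenuEntry> children;  // empty => leaf command

    void print(int indent = 0) const {
        std::cout << std::string(indent, ' ') << label << '\n';
        for (const auto& child : children) child.print(indent + 2);
    }
};

int main() {
    MenuEntry file{"File", {{"New", {}},
                            {"Open", {{"Recent", {}}}},
                            {"Exit", {}}}};
    file.print();  // the printing rule applies unchanged at every level
}
```

The point is not the code itself, but that a handful of building blocks and rules, once captured, cover every menu the application will ever need.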

Creating a whole new language to build a single instance of a thing does not make sense. This argument says that an architecture has no reuse, since it is only produced in one copy. But that is a static view: although an architectural drawing is not reused, the architect will reuse, and create variations on, the same principles.

A static architecture takes a snapshot of a tree and describes the desired result in detail: the phenotype. This is like trying to specify a tree leaf by leaf, branch by branch. Such methods usually elaborate the models by adding more and more detail. From a theoretical perspective it can be done, but it makes no sense... So we are left with a specification-implementation gap.

A dynamic architectural view looks at the dynamics that create the tree: the genotype. It takes into account how a tree grows from a sapling, describing how and why branches reach out for more sunlight. There is no real blueprint of what the finished tree looks like, because in the organic sense it is constantly growing and evolving. At any given point in time we can look at it and ask ourselves whether it is tree-like enough, whether we need to revisit the architectural rules, or whether we should explain to the developers the essential properties that make a tree a tree.

At the detail level we find the same patterns and structures as at the higher levels. Such code is easy to codify, or to apply manually from an example: it is encoded by describing how a standard example is modified and combined with others. Only a relatively small genotype is required to describe an arbitrarily complex phenotype. This is in fact intuitive: we can all easily picture an abstract tree or a dog, and the same logic applies to code.
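A toy C++ sketch of this genotype/phenotype distinction (entirely illustrative): the "genotype" below is three short rules, yet the "phenotype" it unfolds into can be made arbitrarily detailed by changing one number.

```cpp
#include <iostream>
#include <string>

// Genotype: branch twice, shrink each branch, stop when too small.
// Phenotype: the tree structure these rules unfold into.
void grow(double length, int depth) {
    if (length < 1.0) return;                 // stopping rule
    std::cout << std::string(depth * 2, ' ')
              << "branch, length " << length << '\n';
    grow(length * 0.6, depth + 1);            // first sub-branch
    grow(length * 0.7, depth + 1);            // second sub-branch
}

int main() {
    grow(10.0, 0);  // a few lines of genotype, dozens of lines of phenotype
}
```

Specifying the output branch by branch would be the static, phenotype view; the three rules are the dynamic, genotype view.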

A self-conformant architecture applies a limited set of concepts, i.e. small examples with hooks for variation and scaling. The parts are built using the same structures as the whole: no matter how closely you look, you find the same structures and solutions. Object abstraction lets us create objects by specialization, and in the same way every concept has an archetype, an example with code (where applicable).
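A minimal sketch of an archetype with hooks (the names, such as PollingTask, are hypothetical, not taken from a customer system): the skeleton is fixed by the concept, and each specialization varies only at the hooks.

```cpp
#include <iostream>

// Archetype: a worked example of a concept. The run() skeleton is fixed;
// the three hooks are the only points of variation.
class PollingTask {
public:
    virtual ~PollingTask() = default;
    void run() {                      // fixed by the concept
        while (!done()) {
            poll();                   // hook: what to sample
            publish();                // hook: where results go
        }
    }
protected:
    virtual bool done() = 0;
    virtual void poll() = 0;
    virtual void publish() = 0;
};

// Specialization: created from the archetype the way every concept
// instance is, by filling in the hooks.
class TemperatureTask : public PollingTask {
    int samples = 0;
    bool done() override { return samples >= 3; }
    void poll() override { std::cout << "sample " << ++samples << '\n'; }
    void publish() override { std::cout << "publish reading\n"; }
};

int main() {
    TemperatureTask task;
    task.run();
}
```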

Just nailing it vs. a wood-peg joint

More experienced programmers will feel comfortable with this pattern observation and recognize that they essentially produce code that is variations on the same themes. Carpenters and other craftsmen would also agree: there are ways of doing things that a skilled craftsman knows and masters, and that novices have to learn.

The architecture is reused as the parts of the architecture itself! Besides how cool that is... this becomes an organic solution that fits well with agile. The parts and the rules are as vital as the conceptual picture of the intended final product. Each implementation instance is a specialization of a pattern, to which we can preserve a dependency.

Object orientation fails to explain how we build working solutions out of our objects. The thousands of classes in .NET or Boost don't explain how they should be selected and combined to solve different types of problems. There is documentation, but given the enormous redundancy in these libraries we can end up with very arbitrary solutions.

Patterns such as the GoF patterns explain how generalized problems are broken down into collaborations of objects and their roles. We codify these patterns as concepts, where the examples are the archetypes. The parameterized concepts are then woven together by the developers into complete solutions.
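As an illustration of what codifying a pattern as a parameterized concept can look like (our sketch, not a fragment of our tooling), here is the GoF Observer pattern captured once, with the event type as the parameter, and then woven into two different places in a solution:

```cpp
#include <functional>
#include <iostream>
#include <string>
#include <vector>

// The Observer collaboration codified once as a parameterized concept:
// the event type is the parameter, the subscribe/notify roles are fixed.
template <typename Event>
class Subject {
    std::vector<std::function<void(const Event&)>> observers;
public:
    void subscribe(std::function<void(const Event&)> obs) {
        observers.push_back(std::move(obs));
    }
    void notify(const Event& e) {
        for (auto& obs : observers) obs(e);
    }
};

int main() {
    // The same concept woven into two unrelated parts of the solution.
    Subject<std::string> alarms;
    Subject<double> readings;
    alarms.subscribe([](const std::string& a) { std::cout << "ALARM: " << a << '\n'; });
    readings.subscribe([](double r) { std::cout << "reading: " << r << '\n'; });
    alarms.notify("overheat");
    readings.notify(21.5);
}
```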

Pattern-level reuse is revolutionary: it produces significantly better SW of exceptional quality. We have applied it manually in architectural design to produce 24/7 critical systems with outstanding performance. These systems have been designed in very short time by very small (but skilled) teams. The key is conceptual modeling with larger patterns, which makes large parts of the code mechanical.

Self-conformant code, where the same patterns emerge throughout the system, can share a code base and be migrated to code generation as the system matures. Why? Because an evolved pattern can then be instantly applied throughout the system.

This organic approach to SW development suits agile ideas very well. Mutations are promoted into new patterns and building blocks; functions reuse similar older solutions, and so on. The resulting SW is highly complex, BUT its unfolding and its parts are easily explained. Just as a compiler generates the assembler for us, the conceptual code has left the application's active code base.

The active code base can concentrate on the functions and features that make the application valuable. The structural code and interactions become developer-neutral and mechanically assembled, i.e. they have been deskilled. The conceptual code base can now evolve independently, just as you'd pick up a new compiler.

Complexity reduction using Refactoring

Pattern-level code reuse can also be applied in reverse. It is all about increasing reuse and reducing the number of solution variations.

This is not an automation exercise. Manually examining the code in tools such as Lattix, with its dependency structure matrix (DSM), can identify poor code and dependencies in the active code base. Essentially, it is simple to write one C function; it is difficult to have it interact with others.

Some of this interaction may be part of the design, but the majority is actually structural. Implicit dependencies, such as dimensions or untyped variables that imply that "the parties know what they are doing", are especially harmful. Again, such solutions can be moved into the codified conceptual code base.
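A small illustrative example of the kind of implicit dependency such a review flags (the function names are ours): the first version relies on both parties "knowing" that the buffer holds three doubles, while the second makes the dimension explicit in the type, visible to the compiler and to a DSM tool.

```cpp
#include <iostream>

// Implicit dependency: caller and callee must both "know" the buffer
// holds 3 doubles in x, y, z order. Nothing in the types enforces it.
double norm_squared(const double* buf) {
    return buf[0] * buf[0] + buf[1] * buf[1] + buf[2] * buf[2];
}

// Explicit alternative: the dimension and the meaning live in the type,
// so the dependency is structural and checkable.
struct Vec3 { double x, y, z; };

double norm_squared(const Vec3& v) {
    return v.x * v.x + v.y * v.y + v.z * v.z;
}

int main() {
    double buf[3] = {1.0, 2.0, 2.0};
    Vec3 v{1.0, 2.0, 2.0};
    std::cout << norm_squared(buf) << ' ' << norm_squared(v) << '\n';  // 9 9
}
```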

The refactoring techniques applied will reduce complexity. Reduced complexity in turn lowers maintenance and support overhead, and essentially lets us get more done with less.

If you're interested in knowing more about how to work efficiently with refactoring in agile product development, don't hesitate to contact us.

Flexible Automation

Automation is about reducing labor costs by making more with less. Ford was the pioneer, and industrialism preached that large-series mass production was the way to go: factory workers could afford to buy the very products they themselves manufactured. But the key to this has always been cheap unskilled labor, and robots and assembly lines aren't really as flexible as one might believe.

The only way to compete is to automate short-series production. We have experience of product routing through a production facility where NC programs and materials are delivered just in time. When the series size is as small as 1, this requires machines that can understand how to turn a drawing into a part, and that remain open to production aspects.

These are learning machines that can generalize and make qualified guesses. This way the operator's skills are encoded into the machine, increasing the level of automation. The human operator works in a dialogue with the machine, arbitrating, selecting, modifying and establishing new solutions.

This enables machine-program reuse, as good solutions are adapted to similar circumstances. It is the same process humans use when they copy files and imitate others' solutions. The manufacturing process becomes operator-neutral, which in turn yields better quality, as well as the possibility to actively work on process quality enhancement.
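As a toy sketch of this retrieve-and-adapt reuse (illustrative only, and deliberately much simpler than the H* algorithm described below): pick the stored program written for the most similar part, then adapt it to the new one, leaving the operator to review the result.

```cpp
#include <cmath>
#include <iostream>
#include <vector>

// A stored machine program, reduced to one parameter and one setting
// purely for illustration.
struct StoredProgram {
    double part_diameter;  // the part the program was written for
    double feed_rate;      // the "program" we want to reuse
};

StoredProgram adapt_for(double new_diameter,
                        const std::vector<StoredProgram>& library) {
    // Retrieve: the nearest stored case.
    const StoredProgram* best = &library.front();
    for (const auto& p : library)
        if (std::abs(p.part_diameter - new_diameter) <
            std::abs(best->part_diameter - new_diameter))
            best = &p;
    // Adapt: naive proportional scaling of the retrieved solution.
    return {new_diameter, best->feed_rate * (new_diameter / best->part_diameter)};
}

int main() {
    std::vector<StoredProgram> library = {{10.0, 200.0}, {40.0, 120.0}};
    StoredProgram p = adapt_for(12.0, library);
    std::cout << "adapted feed rate: " << p.feed_rate << '\n';  // 240, from the 10.0 case
}
```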

Pattern recognition and machine learning, through instruction combination as well as data adaptation, have been successfully implemented in automation. Our H* algorithm uses pattern matching to generate combination sequences that can efficiently produce robust solutions humans find agreeable. In the real world, solutions have to be robust and tolerant, and accommodate human likes and preferences. The approach is theoretically well founded and a proven foundation for these types of problems.

Big Data Analytics

Our most recent addition is big data analytics. Big data is about collecting large volumes of individually insignificant data fragments that people and machines leave behind. The price of service and convenience today is these small slivers of information, which combined can reveal behavior and even anticipate the future.

Modern machines and devices produce data that supports business decisions. The big shift is that the cost of data feedback and analysis is much smaller now, which lets this information be incorporated into larger processes. It is simple economics that the efficiency of a product determines its price tag. Even a mundane light bulb is no longer a mere source of light, but part of a larger lifecycle process involving environment, energy and maintenance. With rising labor, environmental and energy costs, efficiency is where industrialized nations have their competitive edge.

We help our customers design systems that enable them to develop their business using this information. It can be process monitoring, or just dimensioning the HW. The point is that we never fully understand the details of how products are used and for what; it is often something other than what was intended. The SUV was probably not designed to be the soccer mom's vehicle of choice.

The data processing requires databases capable of efficiently turning these volumes of data into information. The enormous scale places special requirements on how data is stored, maintained and searched. A wood-processing plant can generate terabytes of telemetry per day; a cell phone operator generates millions of samples every day.

 

For more information, Contact Us.