
Spoiled for choice: Dealing with dependencies in software development

Make-or-buy? Decisions to use frameworks or third-party software often lead to discussions about the sensible use of time and money. We show why it doesn't always make sense to rely on ready-made solutions and when in-house development is worthwhile.

Whether large frameworks and libraries or small tools and collections of useful functions: one gets the impression that solutions already exist for every problem. Most are available free of charge as open source software (OSS). Sometimes they cost money, but these days the price is often very low, while a single developer hour is significantly more expensive. Thanks to package managers, third-party software is quickly integrated. This pleases developers and project managers alike, since everyone saves effort and money. So what's the catch?

The fallacy of costs

In most cases, personnel costs make up almost 100% of software development costs. This creates a strong incentive to save developer time, so a tool's promise to reduce it is tempting. And the conclusion to buy in parts of the development as ready-made software is also correct in principle, because less development effort ultimately saves time and therefore money.

But this is also where the problem lies: how exactly do you assess whether ready-made software saves time? The cost driver is not the initial implementation but the time required to solve problems and to maintain the software over its lifetime. This effort depends above all on whether the development team understands the software and is familiar with the code. With third-party software, that familiarity only comes after a long period of working with it. In any case, the trade-off between using third-party software and writing your own is not trivial, as the follow-up costs are very difficult to estimate.

There is often a tendency to overestimate the promises at the beginning, as the decision is made on the basis of incomplete data and the immediate positive effects (less time spent on familiarization and integration in the short term) outweigh the long-term costs in the evaluation. Package managers such as NuGet or npm have taken this principle to the extreme and radically simplified the search for and integration of third-party software.

How do you weigh up the options?

The decision to develop software in-house or to use ready-made software is usually made with little prior experience. In addition, there may be functional differences that cannot be recognized at first glance and only become apparent during implementation, a problem that cannot be completely eliminated.

So if you want to make a good decision, you need factors from which, even with little knowledge, you can deduce whether make or buy is the better choice. The following rules of thumb can simplify the decision.

Scope of the solution

"It offers so much, we are certainly well positioned for the future."

This, or something similar, is often the argument, even if you only need a small part of the third-party software. But if only that small part is needed, isn't it easier to write the functionality yourself? Do I really need a component like left-pad?

It is worth taking a look at the scope of the required solution. Sometimes there are no real arguments for the more complex off-the-shelf solution, and yet you opt for potential that you will never exploit.

Future-proofing

"External components are maintained and updated independently"

It is certainly true that third-party components often receive updates and security patches that extend functionality and improve stability. The maturity of a third-party component in particular indicates how future-proof it is. Factors include:

  • Community engagement and popularity (for open source software)

  • A stable business model through which the manufacturer earns money

  • Market penetration

Some of these factors are easy to evaluate: the stars on GitHub, for example, give an indication of the software's popularity. Market penetration or the viability of the business model can often only be assessed qualitatively.

The integration effort remains

Regardless of the previous factors, there is always a certain amount of integration work. With in-house development, it is included from the start. With third-party software, integration largely consists of building understanding through documentation and trial and error, and this effort tends to grow with the size of the solution. For smaller solutions, it can therefore make sense to develop them in-house in a future-proof manner: the in-house team knows the solution very well and can keep it stable.

Nothing is for free

Neither option is free of charge: staying up to date always costs integration effort, even if package managers try to escape dependency hell (caused, for example, by many dependencies and dependency cascades) through automated version management and approaches such as semantic versioning.
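The role of the package manager can be illustrated with a NuGet floating version. The sketch below is a hypothetical project-file fragment (the package name Example.Library is invented): the wildcard delegates version selection to the package manager, which is convenient, but it also means a restore can silently pull in a new release.

```xml
<ItemGroup>
  <!-- Hypothetical dependency: "1.2.*" floats to the newest 1.2.x patch at
       restore time. Under semantic versioning, patch releases should be
       compatible, but each restore can still change the build without any
       change to your own code. -->
  <PackageReference Include="Example.Library" Version="1.2.*" />
</ItemGroup>
```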

A practical example - Caliburn.Micro

An example from one of our software teams, which builds software with WPF and .NET, shows just how easy an in-house implementation can be. The team initially used the Caliburn.Micro framework, primarily for its support of the MVVM pattern.

The decision was easy, because Caliburn.Micro is free and very easy to use and integrate. By limiting itself to the MVVM pattern, the team had chosen only a small, isolated part that was important to it. Nevertheless, problems arose relatively early on that made development unnecessarily complicated:

  • In the MVVM pattern, as implemented in Caliburn.Micro, the matching of view and view model takes place via the names of the components.

  • This pattern could not be used with the additional integration of DevExpress. Instead, the standard WPF pattern had to be used, consisting of XAML + code-behind.

  • With two patterns in use, the team kept falling back into the WPF pattern, because it was not always clear which pattern applied at which point.

The scope of the solution was therefore reconsidered first. The MVVM pattern itself is a small, isolated part: it can easily be separated out and solved on its own, without a dependency on Caliburn.Micro. Integrating and becoming familiar with the external framework would have been similarly complex.

The in-house implementation therefore comprised:

  • isolating the MVVM pattern, with view-to-view-model matching via a naming convention (instead of Caliburn.Micro's component names), and

  • the integration into WPF.

Our own ideas can now be implemented more easily and quickly, and the code is much clearer and leaner, as only the functions we actually need have been implemented. Our framework is therefore much easier to understand and ready for maintenance and extension. Using our company's own standard formats and structures also makes the software easier to use within the company, so problems can more often be solved with existing solutions.

That this does not require a lot of programming work is shown by the code we wrote ourselves, which delivers the benefits we had hoped for from Caliburn.Micro.

Code example

				
using System;
using System.Collections.Generic;
using System.Linq;
using System.Windows;
using System.Windows.Markup;

public static class DynamicVmTemplateFinder
{
    /// <summary>
    /// Searches for view models and views that match the naming convention and adds them to the resource dictionary
    /// </summary>
    /// <param name="typeNamespace">The namespace in which to search for the view models</param>
    /// <param name="res">Reference to the dictionary in which the templates should be stored</param>
    /// <param name="relTypes">Array of types that contains, among other types, the view model types and view types</param>
    /// <returns>List that contains the imported templates</returns>
    public static List<String> GenerateTemplateDictionary(String typeNamespace, ResourceDictionary res, Type[] relTypes)
    {
        // get the namespaces for view models and views by convention
        String vmNs = $@"{typeNamespace}.ViewModels";
        String viewNs = vmNs.Replace("Model", String.Empty);

        // collect the view model and view types from those namespaces
        var viewModelQuery = from t in relTypes
                             where t != null && t.IsClass && !t.IsAbstract && t.Namespace != null && t.Namespace.StartsWith(vmNs)
                             select t;
        var viewModelTypeList = viewModelQuery.GroupBy(t => t.Name).Select(g => g.First()).ToDictionary(t => t.Name, t => t);

        var viewQuery = from t in relTypes
                        where t != null && t.IsClass && !t.IsAbstract && t.Namespace != null && t.Namespace.StartsWith(viewNs)
                        select t;
        var viewTypeList = viewQuery.GroupBy(t => t.Name).Select(g => g.First()).ToDictionary(t => t.Name, t => t);

        var foundItems = new List<String>();

        // link each VM type to its view type by convention, wrap the pair in a
        // DataTemplate and add the template to the resource dictionary
        foreach (var vmt in viewModelTypeList)
        {
            var viewKey = vmt.Key.Replace("Model", String.Empty);
            if (viewTypeList.ContainsKey(viewKey))
            {
                var viewType = viewTypeList[viewKey];
                var template = CreateTemplate(vmt.Value, viewType);
                var templateKey = template.DataTemplateKey ?? throw new InvalidOperationException();
                if (!res.Contains(templateKey))
                {
                    res.Add(templateKey, template);
                    foundItems.Add($@"Found and added VM -> View: {vmt.Value.FullName} -> {viewType.FullName}");
                }
                else
                {
                    foundItems.Add($@"Pair is already known VM -> View: {vmt.Value.FullName} -> {viewType.FullName}");
                }
            }
        }

        return foundItems;
    }

    private static DataTemplate CreateTemplate(Type viewModelType, Type viewType)
    {
        // Building the DataTemplate from XAML text is the only way to avoid
        // some binding problems later on,
        // see www.ikriv.com/dev/wpf/DataTemplateCreation/

        const String xamlTemplate = "<DataTemplate DataType=\"{{x:Type vm:{0}}}\"><v:{1} /></DataTemplate>";
        var xaml = String.Format(xamlTemplate, viewModelType.Name, viewType.Name);

        var context = new ParserContext { XamlTypeMapper = new XamlTypeMapper(new String[0]) };

        context.XamlTypeMapper.AddMappingProcessingInstruction("vm", viewModelType.Namespace ?? throw new InvalidOperationException(), viewModelType.Assembly.FullName);
        context.XamlTypeMapper.AddMappingProcessingInstruction("v", viewType.Namespace ?? throw new InvalidOperationException(), viewType.Assembly.FullName);

        context.XmlnsDictionary.Add("", "http://schemas.microsoft.com/winfx/2006/xaml/presentation");
        context.XmlnsDictionary.Add("x", "http://schemas.microsoft.com/winfx/2006/xaml");
        context.XmlnsDictionary.Add("vm", "vm");
        context.XmlnsDictionary.Add("v", "v");

        return (DataTemplate)XamlReader.Parse(xaml, context);
    }
}

Using it is very simple. A single call, made from within the App class (where Current refers to Application.Current), registers all templates:

    DynamicVmTemplateFinder.GenerateTemplateDictionary(typeof(App).Namespace, Current.Resources, Assembly.GetExecutingAssembly().GetTypes());

The resource dictionary then performs the matching: via the registered templates, WPF binds view and view model together.
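In a view, a plain ContentControl is then enough to make use of this. The snippet below is a hypothetical example (the property name ActiveItem is invented): WPF looks up the DataTemplate registered for the type of the bound view model and renders the matching view automatically.

```xml
<!-- Hypothetical host view: ActiveItem on the data context holds the current
     view model. WPF finds the DataTemplate registered for its type in the
     resource dictionary and instantiates the matching view. -->
<ContentControl Content="{Binding ActiveItem}" />
```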

By June 2020 at the latest, it was clear that the decision had been the right one: at that point, the developer marked Caliburn.Micro as no longer maintained.

The reasons are often the same: only a few developers are responsible for the project, their private or professional circumstances change, and they can no longer maintain what is frequently a privately run project.


Conclusion

The example shows that, in this case, careful consideration led to a decision where the investment paid off. The freedom to make such decisions within the development team characterizes the work at OHB Digital Services.

There is plenty of scope for these decisions, people can contribute their own ideas and the project manager does not dictate what type of library should be used. The fact that decisions are made as a team often results in better solutions.

Over time, it has become clear to us that it makes more sense to create our own solutions and develop a simple, small framework ourselves rather than using more complex third-party frameworks.

If you nevertheless decide to use a third-party library, you should always be aware that it will not take care of itself. Instead of integrating it and forgetting about it, third-party software also demands a high level of care and maintenance. As the number of dependencies grows, hidden incompatibilities can slip out of sight, and complex bug fixes can result if the third-party software is not thoroughly understood. None of this must be forgotten.

In the end, of course, the best possible product should be available for the customer and the decision should be made accordingly.

Your journey with OHB Digital Services

Use the knowledge from space travel for your business. OHB Digital Services GmbH has been a reliable partner for secure & innovative IT solutions for many years. We are part of one of the most successful space and technology companies in Europe. With our products and services, we support you in the digitalization of your business processes along the value chain and in all security-related issues. Please feel free to contact us.


Does that sound interesting for you and your company?

Then get in touch with us.