The first step out of the PLM status quo!


The other day I talked to a colleague of mine about PLM (or whatever acronym we decide to use) and why things are moving so slowly. Almost everyone we meet agrees that we must address the vast amount of manual overhead associated with the management of products. We know that our value chain needs to be automatically fuelled with product information due to the digital reality we now live in. Manual interventions, Excel files, and “can you send me an email with the latest information about this product” must stop if we want to be successful in our digital journeys. We need to find a way to execute our business processes in a more flexible and efficient fashion, and further leverage analytic capabilities to become more data driven. Yet we are struggling to get out of our current ways of working, where only legacy tools are used for managing products, and even these aren’t utilized to their full capacity.

One of many reasons why it is so hard to change is that we do not really know what lies ahead. We do know that the future is different, and we have already identified some unknowns to be addressed, but we also know that there are plenty of unknown unknowns. There is somewhat of a catch-22 situation: the current ways of working need to be changed, but the future is not clear, making us hesitate. Once again, we are forced to invent new things as we go, most often in isolation as add-ons to legacy tools, which only increases the technology lock-in.

A common theme for successful companies is that they have a direction of where to go, together with a strategy to balance the “old” with the “new”. Both modes (old and new) are essential to create substantial value and drive the needed change, and neither mode should be static. We also see that incremental improvements to current solutions alone won’t get you to your target fast enough, while focusing solely on the future mode, where we need to invest heavily in new technology, is too high of a risk. A bimodal approach is gaining traction and is one alternative for addressing this situation. This approach gives us the opportunity to manage two parallel work streams, one focusing on predictability and the other on exploration.

Mode 1 targets well-understood and proven concepts, incrementally improving legacy environments to meet some of today’s requirements. Mode 2 explores new ways to solve problems in areas of uncertainty. Projects in mode 2 are typically iterative, where new concepts are tested and proven, often adopting a Minimum Viable Product (MVP), or as we like to call it a Minimum Lovable Product (MLP), approach. Findings from mode 2 are continuously communicated and shared with people in mode 1 in order to incorporate more radical changes at a manageable pace.

This bimodal approach is slowly being adopted in the PLM domain. One issue with most legacy PLM applications and organizations is that even simple changes to functionality, business logic, data models or responsibility often result in expensive and time-consuming projects. What we need to do is invest in areas that increase the flexibility in the PLM domain. The ongoing transformation, with as-a-service offerings, higher demand for product customization and connected products, is picking up pace, and we will continuously miss business opportunities if we are unable to change quickly. In our explorative mode we need to set bold targets. The turnaround time from a new business requirement to an implemented and verified solution in operation should be days or maybe weeks, not the months or years that are the norm today.

For these reasons in particular, cloud native development is gaining traction in many companies. DevOps teams are increasingly being leveraged, utilizing cloud native technology to build, verify and operate applications that truly exploit the advantages of cloud. We must re-organize the PLM team to ensure that the organization has the competence needed to work with new stakeholders from sales, service, analytics, and software development departments. This should be done in a way that allows us to continuously realign development priorities with the rapidly changing business priorities.

Microservices will be central. It’s a software design architecture where the solution consists of several small and independent modules. This means that an application or tool is divided into smaller parts, each with a specific function in the application. The application is thus modular, unlike a monolithic one, enabling a much higher degree of flexibility. This arrangement means that each microservice can have its own life cycle and design. The DevOps team in charge of a service can select its own programming language and underlying database to maximize performance. Services are decoupled from one another, meaning that every service can be improved and upgraded independently of the others. This makes it possible to release new features daily. It also enables scalability in a whole new way, as you can scale a service independently instead of scaling all functionality at once as in a monolithic system.
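To make the decoupling idea concrete, here is a minimal sketch in Python. The service names (PartService, BomService) and their methods are hypothetical illustrations, not part of any real PLM product: each service owns its own private data store, and other services may only go through its public interface. In a real deployment each class would be a separately deployed process with its own database, communicating over the network.

```python
class PartService:
    """Owns part master data. Its storage is private; in production this
    service could pick its own database and even its own language."""

    def __init__(self):
        self._parts = {}  # private store: no other service touches this directly

    def register_part(self, part_id, description):
        self._parts[part_id] = {"description": description}

    def get_part(self, part_id):
        # The interface, not the storage, is the contract with other services,
        # so the internals can be upgraded without breaking consumers.
        return dict(self._parts[part_id])


class BomService:
    """Owns bill-of-material structures. It talks to PartService only
    through its public interface, never its internal data."""

    def __init__(self, part_service):
        self._part_service = part_service
        self._boms = {}  # this service's own private store

    def add_component(self, assembly_id, part_id, qty):
        # Validate the part via the other service's API (raises KeyError if unknown).
        self._part_service.get_part(part_id)
        self._boms.setdefault(assembly_id, []).append((part_id, qty))

    def bom(self, assembly_id):
        return list(self._boms.get(assembly_id, []))


parts = PartService()
parts.register_part("P-100", "M6 bolt")
bom = BomService(parts)
bom.add_component("A-1", "P-100", 4)
print(bom.bom("A-1"))  # [('P-100', 4)]
```

Because BomService depends only on PartService’s interface, either service can be re-implemented, upgraded or scaled out on its own, which is the flexibility argument made above.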

Cloud native development and technology will provide some important pieces of the PLM puzzle. By decoupling functionality in small steps, the complexity will become more manageable and pave the way to slowly get out of our current manual ways of working without changing everything in a big bang. This journey is ongoing, and we do believe that by leveraging new technology and embracing change rather than fearing it, we will eventually bridge silos and realize the full potential of product information.

Let us know if you also want to talk about why your PLM journey isn’t moving fast enough.


Marcus Ohlin is a Management Consultant and Business Analyst at FiloProcess, an independent Swedish consultancy firm passionate about realizing the value of product information.