Model Oriented Domain Analysis and Engineering (MODA + MODE) offers a systematic approach for conducting commonality and variability analysis across the needs of all customers and prospects of a product line, and for sourcing and surfacing the domain knowledge needed to design intuitive configuration tools that can be operated by customers.
The following MODA + MODE guidelines for product line management and new product line design and development are available from S23M.
Identification of Product Line Stakeholders and Product Line Scope
Date: 2015-01-30
Conclusion: A product line engineering approach to digital service development and operation can unlock significant value if due diligence is applied when identifying product line stakeholders and product line scope. A successful product line is one that enables all stakeholders to apply their unique expertise within the context of the product line at exactly those points in time when their knowledge and insights are required as part of the organisational decision making process. Good product line architectures align human expertise, organisational structure, business processes, software system capabilities, and value chain integration with customers and suppliers.
Essential Digital Product Line Design Goals
Date: 2014-12-30
Conclusion: The digitisation of service delivery in the finance, insurance, and government sectors means that all organisations in these sectors are now in the business of developing, maintaining, and operating software products for millions of users, with profound implications for organisational structures, business architectures, and the approach to service development and operation. Whilst internal business support functions can usually be addressed via off-the-shelf software, with very few exceptions, the functionality of customer facing services can’t be sourced off-the-shelf.
In the Ocean of Big Data, Machines represent the top of the Digital Food Chain
Date: 2014-11-30
The popularity and growth of online social media platforms has pushed social data into the spotlight. Humans using the Web mainly interact with human-produced data. Yet the floods of machine-generated data that flow through the Internet remain invisible to humans. For a number of reasons, attempts by organisations to mine big social data to improve marketing and to increase sales will fall significantly short of expectations. Data from digital devices and sensor networks that are part of the Internet of Things is eclipsing human-produced data. Machines have replaced humans as the most social species on the planet, and this must inform the approach to data science and the development of healthy economic ecosystems.
Risk management and quality assurance of large enterprise Cloud service rollouts
Date: 2014-10-26
When implementing enterprise Cloud services, a disciplined and locally distributed approach to user acceptance testing in combination with real-time dashboards for test management and defect management can be used as the centrepiece of a highly scalable quality assurance framework. An effective quality assurance process can go a long way to minimise risks, and to ensure a timely and successful rollout.
How best to articulate the rules of business
Date: 2014-09-24
The development of new digital services often entails not only changes to workflows but also changes to the business rules that must be enforced by software. Whilst vendors of business rule engine technology often market their products as powerful and highly generic tools, the best results are achieved when restricting the use of the different approaches to specific use cases.
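As a minimal sketch of the point about restricting each approach to specific use cases: simple validation rules are often better expressed as declarative data evaluated by a few lines of code than routed through a generic rule engine, which keeps the rules readable by business stakeholders. The rule names, fields, and thresholds below are hypothetical examples, not taken from the article.

```python
# Hypothetical business rules expressed as declarative data rather than
# buried in procedural code. Each rule pairs a predicate with the message
# shown when the rule is violated.
RULES = [
    {"name": "minimum_age",
     "check": lambda a: a["age"] >= 18,
     "message": "Applicant must be at least 18 years old"},
    {"name": "income_floor",
     "check": lambda a: a["income"] >= 25_000,
     "message": "Annual income must be at least 25,000"},
]

def validate(applicant):
    """Return the messages of all rules the applicant violates."""
    return [r["message"] for r in RULES if not r["check"](applicant)]

# The under-age rule fires; the income rule does not.
print(validate({"age": 17, "income": 30_000}))
```

Because the rule set is plain data, it can be reviewed, versioned, and extended without touching the evaluation logic; a full rule engine only pays off when rules interact in ways this flat structure cannot express.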
The Lingua Franca of Government is not SAP, it’s social
Date: 2014-08-31
In government organisations the potential for standardisation and process automation via the use of enterprise resource planning software is largely limited to internal administration. In terms of digital service development government organisations can optimise their IT budgets by understanding themselves as knowledge-transformation organisations rather than as consumers of off-the-shelf technology.
Last Word: Do you have a healthy relationship with your Software?
Date: 2014-08-21
Once upon a time there was a programmer who developed software, working for a software vendor, and there was a CEO, a CIO, and a sales executive who all worked for a manufacturing business. It was a happy time, where everyone knew who developed software, who bought software, who implemented software, and who used software. In this long-gone era businesses delivered physical goods and professional services, and software was a helpful tool to standardise business processes and automate tedious repetitive tasks. Those were the days where hardware was solid, software was easy to deal with (certainly not harder than dealing with a teenager) and humans were the masters of the universe….
The Lingua Franca of Business is SAP – not English
Date: 2014-07-29
The operational model and associated processes of larger organisations in many sectors of the economy are encoded in software. Enterprise software from SAP plays a dominant role in many industries and significantly influences the terminologies and workflows used within organisations, in particular in those domains where SAP offers out-of-the-box solutions. The resulting level of standardisation has tangible advantages, but also represents an upper limit to the level of operational efficiency that is achievable. Organisations that rely on SAP are well advised to get independent advice to determine the optimal level of lock-in to SAP.
A Culture of minimal Knowledge sharing Costs more than Money
Date: 2014-06-24
Organisations that fail to recognise the difference between information and knowledge are at risk of haemorrhaging knowledge at a rate that at the very least has a measurable impact on the quality of service delivered by the organisation. In the worst case, a loss of knowledge poses an existential threat to a product line or to the entire organisation. Whilst tools can play an important role in facilitating knowledge preservation, it is information sharing between individuals and teams that fuels the creation of knowledge.
User Experience Design: More than meets the eye
Date: 2014-05-27
Consumer-oriented software and online services are raising user expectations. Determining which aspects of user experience design, and which trade-offs, are appropriate in a particular business context requires extensive collaboration across multiple disciplines. The cross-disciplinary nature of the work must be considered when evaluating external providers of user experience design services. References and case studies should be consulted to confirm cross-disciplinary capabilities and the level of expertise in all relevant disciplines.
Architectures for Mobilised Enterprise Applications
Date: 2014-04-26
Enterprise software vendors and enterprise software users are increasingly investing in functionality that is accessible from mobile devices, and many organisations face the challenge of making key legacy applications accessible on mobile devices. Comprehensive and reliable APIs are the key to the creation of architectures that enable a seamless user experience across a range of mobile devices, and across a backend mix of state-of-the-art Cloud services and legacy systems.
Service resilience – A defining factor of the user experience
Date: 2014-03-31
The digitisation of services that used to be delivered manually puts the spotlight on user experience as human interactions are replaced with human to software interactions. Organisations that are intending to transition to digital service delivery must consider all the implications from a customer’s perspective. The larger the number of customers, the more preparation is required, and the higher the demands in terms of resilience and scalability of service delivery. Organisations that do not think beyond the business-as-usual scenario of service delivery may find that customer satisfaction ratings can plummet rapidly.
An SOA maturity assessment is the first step towards an integrated platform for delivering innovative digital services
Date: 2014-02-26
Technology is increasingly a commodity that can be sourced externally. In contrast, trustworthy data has become a highly prized asset. Data storage can be outsourced, and even SOA (Service Oriented Architecture) technology can be sourced from the Cloud, but the patterns of data flow in a service-oriented architecture represent the unique digital DNA of an organisation – these patterns and the associated data structures represent the platform for the development of innovative digital services.
To improve the Business Intelligence of your Organisation you need to learn the Language of Data
Date: 2014-01-27
Machines are becoming increasingly proficient at tasks that, in the past, required human intelligence. Virtually all human domain expertise can be encoded in digital data with the right knowledge engineering tools. The bottleneck in understanding between humans and software is shaped by the ambiguities inherent in human communication, not by the challenge of developing machine intelligence. To benefit from big data, organisations need to articulate knowledge in the language of data, i.e. in a format that is not only understandable by humans but also actionable by machines.
Business risk management and proactive crisis management
Date: 2013-12-30
As physical and digital supply chains become more integrated across organisational, regional and national boundaries the potential impact of an emergency or crisis can be far reaching. A proactive approach to crisis management requires an awareness of all the high-impact crisis and emergency events that could affect an organisation and requires appropriate tools for risk assessment and active hazard management.
Do you know what I mean?
Date: 2013-12-21
Over the last few years the talk about search engine optimisation has given way to hype about semantic search….
Next generation crisis management – emergency response systems
Date: 2013-11-26
Over the last five years the market of crisis response systems has undergone a rapid evolution. Innovative solutions exploit the proliferation of smart mobile devices, the continuously growing number of available data feeds, the simplicity of the deployment models afforded by the Web and powerful geographic information system functionality. Given the maturity of some of the available solutions, it makes sense for larger organisations in the public sector and for utility organisations to consider the deployment of a modern crisis management and incident response system.
Measuring the performance of an Enterprise Architecture team
Date: 2013-10-28
Difficulty in defining performance criteria for an enterprise architecture team typically points to a lack of clearly articulated business priorities, or a lack of a meaningful baseline against which performance can be assessed. An enterprise architecture team needs to be given clear objectives that relate to the performance of the business without being prescriptive in terms of the target IT system landscape.
Business architectures for the Digital Service Economy
Date: 2013-09-23
Today, nearly all organisations are delivering services to customers and suppliers. Quality of service expectations of external stakeholders create significant challenges for organisations that were used to treating IT and software needs as internal topics that are at least one level removed from customers and suppliers. Digital services have evolved into the key mechanism for embedding an organisation into the external value chain. Articulating a clear conceptual picture of the external value chain in precise terms without using any IT jargon is a prerequisite for innovation and successful business transformation; having an IT strategy is no longer good enough.
The organisational pattern for digital service delivery
Date: 2013-08-30
Every business now operates in a context that includes the use of digital services. While the IT strategies of many organisations articulate a business case for technological innovation, they offer little guidance in terms of organisational patterns that enable and facilitate the delivery of useful and reliable digital services. Organisational structures must be adapted to meet the needs of the new world of digital service networks.
Construction of secure and resilient Cloud services
Date: 2013-07-29
In a few years from now the Cloud services we use today will look as quaint as the highly static Web of 1997 in the rear view mirror. In the wake of the global financial crisis the hype around big data is still on the increase and big data is perceived as the new oil of the economic engine. At the same time, many of the data management technologies underlying Cloud services are being consolidated, creating new kinds of risks that can only be addressed by the adoption of a different data architecture.
Last Word: the new science of data
Date: 2013-07-22
Data scientists are in hot demand. In December 2012 the Harvard Business Review featured an article titled “Data Scientist: The Sexiest Job of the 21st Century”. International online job boards and LinkedIn have many thousands of openings asking for big data skills and a growing number of openings for data scientists. What is all the hype about?
Integration of Big Data Cloud Formations – Cyclone Alert
Date: 2013-06-28
Cloud infrastructure and platforms have started to alter the landscape of data storage and data processing. Software as a Service (SaaS) Customer Relationship Management functionality such as Salesforce.com is considered best of breed, and even traditional vendors such as SAP are transitioning customers to SaaS solutions. The recent disclosure of global Cloud data mining by the National Security Agency (NSA) has further fuelled concerns about industrial espionage in Europe and has significantly raised citizen awareness with respect to privacy and data custodianship. Any realistic attempt to address these concerns requires radical changes in data architectures and legislation.
ERP in Australia 2013 – the system integration challenge
Date: 2013-05-28
Most organisations that use enterprise resource planning (ERP) software have a need to integrate the ERP system with other enterprise software. It is common for ERP systems to be integrated with customer relationship management software (CRM) and with all the bespoke applications that operate at the core of the business. Some organisations strive to simplify the system integration challenge with a single silver-bullet system integration technology, but this approach only works in the simplest scenarios, when the number of system interfaces is small. Instead, aiming for maintainable integration code leads to better results.
ERP Trends in Australia 2013: Standardisation or Best-of-Breed?
Date: 2013-04-29
Many organisations in Australia rely on SAP software for enterprise resource planning (ERP) software. To get the best results out of their data, a significant number of organisations have implemented a data warehouse alongside operational systems, and are combining SAP software with best-of-breed technologies for customer relationship management and system integration. Whilst SAP software continues to provide important functionality, it pays to understand to what extent standardisation of ERP functionality makes economic sense, and from what point onwards standardisation reduces the organisation’s ability to deliver unique and valuable services. Standardisation is desirable only if it leads to a system landscape that is simpler and sufficiently resilient.
No Big Data without open public sector information
Date: 2013-03-28
Government agencies are slow in implementing open public sector information in line with freedom of information requirements. Agencies are challenged in terms of awareness of related government policies, in terms of cross-disciplinary collaboration, and in terms of obtaining funding for open data initiatives. The implications are not limited to government, but also affect the ability of Australian businesses to develop innovative products that derive value from Big Data in the public domain.
Resilience or efficiency? Increasing one invariably reduces the other
Date: 2013-02-26
Today organisations need to adapt swiftly to changes in their external environment. Brittleness and inflexibility are characteristic of complex systems that lack modularity and redundancy. Resilient systems offer an appropriate level of redundancy at all levels of abstraction: from replicated skill sets within organisational structures to physical redundancy of hardware. In other words, a simplistic focus on efficiency may introduce more risks than benefits.
Service Virtualisation – A key success factor for DevOps and SOA
Date: 2013-01-30
The concept of service virtualisation is fundamental to the development of scalable service oriented architectures (SOA) and to the implementation of a DevOps approach to software change and operations. On the one hand service virtualisation enables the development of resilient high-availability systems, by enabling dynamic switching between different service instances that may be running on completely independent infrastructures. On the other hand, service virtualisation enables realistic integration tests of non-trivial Web service supply chains.
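The two capabilities described above both rest on one mechanism: clients resolve a service by name through an indirection layer rather than binding to a concrete instance. A minimal sketch, with entirely hypothetical service names, might look like this:

```python
# Illustrative sketch of service virtualisation: clients look services up
# in a registry by name, so a test (or a failover controller) can swap in
# a virtualised instance without any change to client code.

class VirtualPaymentService:
    """A virtualised stand-in that records calls and returns canned results,
    enabling realistic integration tests without live infrastructure."""
    def __init__(self):
        self.calls = []

    def charge(self, amount):
        self.calls.append(amount)
        return {"status": "approved", "amount": amount}

# The registry is the indirection layer; in production it would point at
# real service instances, possibly on independent infrastructures.
registry = {}

def resolve(name):
    return registry[name]

# A test wires the virtual instance into the registry...
registry["payments"] = VirtualPaymentService()

# ...and the client code is unchanged: it only knows the service name.
result = resolve("payments").charge(42.50)
print(result["status"])
```

The same lookup that lets a test substitute a simulated backend also lets an operator redirect traffic to a replica, which is why service virtualisation serves both the testing and the high-availability goals.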
White Paper: Unlocking the Value of Big and Small Data – Challenges, Opportunities & Strategies
Date: 2013-01-19
The topic of Big Data has been propelled from the engine room of the Web 2.0 giants into the mainstream press. Over the last decade, the volume of data that governments and financial institutions collect from citizens has been eclipsed by the data produced by individuals in terms of photos, videos, messages, as well as geolocation data on online social platforms and mobile phones, and also the data produced by large scale networks of sensors that monitor traffic, weather, and industrial systems….
DevOps – achieving resilience and agility in the Cloud
Date: 2012-12-25
DevOps is a grassroots movement that is only a few years old but has quickly spread across the globe, and its influence is present in virtually all organisations that operate popular Cloud services. DevOps is a portmanteau of software system Development and Operations, referring to the desire to bridge the gap between development and operations that is inspired by agile techniques, and that is driven by the need to continuously operate and upgrade Cloud services. The DevOps movement is having a profound impact in terms of the tools and techniques that are used in the engine rooms of Clouds, leading to order of magnitude changes in the ability to perform hot system upgrades.
Maturity of information management matters more than ever in the age of Big Data
Date: 2012-11-30
The maturity of information management practices in an organisation has a direct effect on the ability to achieve business goals related to supply chain optimisation, the quality of financial decisions, productivity and quality of service. The exponential growth of unstructured information is no replacement for structured information. Quite the opposite: a stream of unstructured Big Data can only be turned into tangible value once it is channelled through a distillery that extracts highly structured information accessible to human decision makers, and that can be used to provide a service to the public or to drive a commercial business model.
Last Word: Death by standardisation
Date: 2012-11-21
Some standards are undeniably useful, and the benefits of these standards can typically be quantified in terms of improvements in quality and productivity due to increases in the level of automation and interoperability. In contrast, other standards mainly fuel a certification industry that has developed around a standards body, without leading to any measurable benefits, whilst clearly adding to the operating costs of those organisations that choose to adopt such standards….
Operational frameworks for information product design and delivery
Date: 2012-10-27
Increasingly, organisations are recognising that they can benefit from a so-called software product line approach. The transition from an IT organisation that operates entirely in project delivery mode to a product development organisation that introduces a product line governance process is a significant undertaking. The process involves the designers of business information services as well as Enterprise Architects and other domain experts. Achieving the benefits of a product line approach (systematic reuse of shared assets) requires the adoption of a dedicated product line engineering methodology to guide product management, design, development, and operations, and it also requires knowing where to draw the boundary between product development and the delivery of professional services.
A framework for information management maturity assessments: is the organisation ready for Big Data?
Date: 2012-09-30
Effective data science requires a cross-disciplinary team of highly skilled experts, as well as data in sufficient quantity and quality. These requirements imply a level of maturity in information management that is beyond the capability of most organisations today. An information management maturity assessment can help determine whether an organisation is ready to embark on a big data initiative, and to identify any concrete deficits that need to be addressed.
Can Big Data change the colour of your data warehouse elephant?
Date: 2012-08-29
There are many links between the story of data warehousing and the story of SAP adoption, going all the way back to 1997, when SAP started developing a “Reporting Server”. Over the following decade SAP firmed up its dominant position as a provider of Enterprise Resource Planning functionality, creating countless business intelligence initiatives in the wake of SAP ERP implementation projects. Up to 80% of data warehouses have become white elephants, some completely abandoned, and others have been subjected to one or more resuscitation attempts.
Simplifying Service Oriented Architecture with Complex Events
Date: 2012-07-30
Direct dependencies between services represent one of the biggest mistakes in the adoption of a service oriented architecture. An event driven approach to service design and service orchestration is essential for increasing agility, for achieving reuse and scalability, and for simplifying application deployment. Complex Event Processing offers a gateway to simplicity in the orchestration of non-trivial service supply chains.
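The decoupling argument above can be sketched in a few lines: with a topic-based event bus, a publishing service needs no knowledge of its subscribers, so new consumers can be added without touching existing code. The topic and event field names below are hypothetical.

```python
# Illustrative sketch of event-driven decoupling: services communicate
# through a shared bus instead of depending on each other directly.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # The publisher only names a topic; it never references a subscriber.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
shipped = []

# Shipping and billing react to the same event independently; the ordering
# service that publishes the event knows nothing about either of them.
bus.subscribe("order.placed", lambda e: shipped.append(e["order_id"]))
bus.subscribe("order.placed", lambda e: print("invoice for", e["order_id"]))

bus.publish("order.placed", {"order_id": "A-1001"})
```

A Complex Event Processing engine generalises this pattern by letting subscribers match on patterns across streams of events rather than on single topics, which is what keeps non-trivial service supply chains simple to orchestrate.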
Big data services for grown-up organisations
Date: 2012-06-27
Big data not only refers to the growing amounts of netizen-generated online data, it also refers to customer expectations related to the data services provided by corporations and government departments. Increasingly corporate and individual service users expect not only a basic service, but also access to advanced tooling for data transformation, representation, and integration into other systems.
Quality of Service is discounted until you count it
Date: 2012-05-26
When conceiving and designing new services, the primary focus of product managers and technologists is often on functionality, and adequate quality of service is largely assumed as a given. Similarly, a potential user of a new service is mainly concerned about the functional fit of the service, and is prone to making implicit assumptions about quality of service based on brief experimental use of the service.
Lean IT: a diet for those who have mastered discipline and agility
Date: 2012-04-28
Increasingly, organisations are looking beyond classical agile methodologies, towards lean techniques pioneered in industrial production. The transposition of lean techniques into the context of corporate IT is a challenge that requires a high level of process maturity and organisational discipline. The desired benefits only materialise if the lean approach is applied to processes that can be put under statistical control, and if the approach feeds into a domain engineering process that addresses the root causes of operational inefficiencies.
It pays to watch your languages
Date: 2012-03-25
All organisations are multilingual, and most, more so than may seem apparent on the surface. A systematic effort to minimise the likelihood and impact of communication problems can lead to significant cost savings, productivity improvements, and improvement of staff morale. Data quality, the quality of system integration, and the quality of product or system specifications often turn out to be the Achilles’ heel. It is a mistake to assume that the biggest potential for misunderstandings is confined to the communication between business units and the internal IT department. Whilst some IT departments could certainly benefit from learning to speak the language used by the rest of the business, the same conclusion applies to all other business units.
Last Word: Essential platform relativity theory
Date: 2012-03-22
Circa 1960: The “Hard theory of platforms”. In the early days of information technology, hardware was THE platform. Companies such as IBM and DEC provided the big iron. Business software was THE application. In those days even software was as hard as stone. The term application platform was unheard of. Circa 1980: The “Soft theory of platforms”. Later, in…
Business Intelligence is automation of operational management
Date: 2012-02-25
Pattern-based and repeatable processes, such as gathering operational data, validating data, and assessing data quality, offer potential for automation. The Web and software-as-a-service technologies offer powerful tools that facilitate automation beyond the simple mechanical pumping of data from one system to the next. Operational management tasks that focus on administration and control can and should be automated, so that managers have time to think about the organisation as a system, and can focus on continuous improvement.
The rise of high performance management teams
Date: 2012-01-29
The Australian Institute of Management recognises that leadership and management will need to continue to evolve to keep up with technological innovation and globalisation. Whilst organisations are usually aware of the need to keep up with technological changes, they often struggle with the practical implications for management and impact on organisational structure. On the one hand operational management can increasingly be automated, and on the other hand the ability to build and lead high performance teams is gaining in importance. Having appropriate people in executive team leadership positions is critical.
Big data, low quality and a big gap in skills
Date: 2011-12-29
Over the last decade, the volume of data that governments and private corporations collect from citizens has been eclipsed by the data produced by individuals, as photos, videos, and messages on online social platforms, and also the data produced by large scale networks of sensors that monitor traffic, weather, and industrial systems. Web users are increasingly recognising the risks of handing over data-mining rights to a very small group of organisations, whilst getting very little in return. The pressure is on to develop robust solutions that not only deliver value, but also address concerns about data ownership, privacy, and the threat of data theft and abuse.
How much Enterprise Content Management is enough?
Date: 2011-11-29
Does every organisation need a dedicated ECM system? Not necessarily. Given the breadth of the topic, it is common to use a combination of different systems to adequately address enterprise wide management of content. When embarking on an ECM initiative, it is important to set clear priorities, and to explicitly define the limits of scope, otherwise the solution that is developed may primarily be a costly distraction.
Executive education in information management and technology trends
Date: 2011-10-24
Educating executives in the essentials of information management and related technology trends is an ongoing challenge. CEOs and board members are being bombarded with simplistic marketing messages from the big global IT solution vendors, as well as the messages from the most prominent local IT service providers. The same vendors usually target CIOs and senior IT managers with a bewildering set of new, “must-have” technologies every year. To avoid spending millions of IT dollars on dead ducks, vendor claims must be deconstructed into measurable aspects of product or service quality.
Domain Engineering – The missing link between customer needs and product/service design
Date: 2011-09-28
The discipline of Enterprise Architecture has evolved from the need to articulate and maintain a big picture overview of how an organisation works, covering organisational structure, processes, and systems. Whilst Enterprise Architecture can assist in implementing industry best practices, several-fold improvements in productivity and quality are only possible if the organisation makes a conscious effort to attract and retain top-level subject matter experts, and if it commits to a so-called Domain Engineering / Software Product Line approach to the strategic analysis of market needs and the design of products and services.
The Art of Lock-In – Part 3
Date: 2011-08-24
Lock-in to software technology always goes hand in hand with lock-in to knowledge. When using Commercial Off-The-Shelf (COTS) software, most of the lock-in relates to elements external to the organisation. In contrast, the use and development of open source software encourages development of tacit knowledge that extends into the public domain. It is time to move beyond the passive consumption of open source software, to remove business-risk inducing restrictions on the flow of knowledge, and to start actively supporting the development of open source software.
The Art of Lock-In – Part 2
Date: 2011-07-25
Lock-in is often discussed in relation to external suppliers of products and services. In doing so it is easy to overlook the lock-in relating to internal tacit knowledge and in-house custom software. The opposite of lock-in is not “no lock-in”, it is lock-in to an alternative set of behaviour and structures. Even though organisations can sometimes suffer from an excessive degree of external lock-in, organisations also benefit from lock-in, in the form of reduced costs and risk exposure. The art of lock-in involves continuously monitoring the business environment, and knowing when to switch from external to internal lock-in and vice versa.
The Art of Lock-In – Part 1
Date: 2011-06-28
To date vendors such as Microsoft and Apple have been able to exploit operating systems as an effective mechanism for creating locked-in technology ecosystems, but the emergence of the HTML5 standard and Google Chrome sees the value of such ecosystems tending towards zero. Providers of Cloud Computing services are united by the goal of minimising the relevance of in-house IT, from hardware right up to operating systems and higher-level infrastructure software. Enterprise application vendors such as SAP and Salesforce.com are pulling in the same direction. To avoid sunk IT costs and a dangerous level of technology lock-in, any further developments of in-house architectures and applications that ignore this trend should be re-examined.
Reconnecting software quality expectations and cost expectations
Date: 2011-05-24
In many organisations there is a major disconnect between user expectations relating to software quality attributes and expectations relating to the costs of providing applications that meet those attributes. The desire to reduce IT costs easily leads to a situation where quality is compromised to a degree that is unacceptable to users. There are three possible solutions: invest heavily in quality assurance measures; focus on the most important software features at the expense of less important ones; or tap into available tacit domain knowledge to simplify the organisation, its processes, and its systems.
Last Word: Hi, this is your software talking!
Date: 2011-05-20
Software: Ah, what a day. Do you know you’re the 53,184th person today asking me for an account balance? What is it with humans, can’t you even remember the transactions you’ve performed over the last month? Anyway, your balance is $13,587.52. Is there anything else that I can help you with? Customer: Hmm, I would have expected a balance of at least $15,000. Are you sure it’s 13,500? …
Your Business is on Autopilot: The business case for knowledge reconstruction
Date: 2011-04-30
We are living in the Knowledge Age, and the operations of many organisations are critically dependent on the use of software-intensive systems. The value of operational data is well recognised, and the power struggle between the Internet superpowers such as Google, Amazon, and Facebook is largely about control over data. Knowledge, however, is much more than raw data, and can be defined as the capability to transform data into valuable products and services. Today vast amounts of knowledge are expressed in the form of program source code and related data structure definitions.
QA Fundamentals: Do you know what your software does behind your back?
Date: 2011-03-30
In order to be effective, Quality Assurance must be woven into all parts of the organisational fabric. Designing, implementing, and monitoring the use of an appropriate quality management framework is the role of a dedicated Quality Assurance Centre of Excellence within the organisation. This internal function ties together the QA measures that apply to core business processes and the technical QA measures that apply to IT system development and operations. Unless the QA CoE provides useful tools and metrics back to other business units, quality assurance will not be perceived as an essential activity that increases customer satisfaction ratings.
Transactions: Getting serious about collaboration
Date: 2011-02-28
The evolution of the social Web 2.0 is creating a plethora of technologies for conducting transactions, with eBay, Amazon, and PayPal being the most prominent players. The global financial crisis has sped up a trend towards specialised markets for peer-to-peer transactions and towards radically new business models that have the potential to transform entire industries. Consumers and SMEs are driving the change, and traditional banks and established corporations must re-focus part of their competitive edge on those areas that complement peer-to-peer transactions. Peer-to-peer exchange is as old as recorded human history, but traditionally it was limited in scope, leading to the creation of financial institutions that perform the role of a broker of trust between sellers and buyers, a role that is now being challenged by web based alternatives.
Reducing the risks inherent in using 3rd party web services
Date: 2011-01-26
The increasing reliance of software solutions on third party web services creates new kinds of risks that must be considered when designing software systems. The main difference between in-house software components and external web services is the level of control available in the event of unforeseen issues. Consequently it is prudent to invest in improving the level of fault-tolerance and…
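The fault-tolerance argued for above can be illustrated with a minimal sketch: a wrapper that retries an external call a bounded number of times and then falls back to a locally controlled alternative, so an outage in a third party service degrades gracefully instead of propagating. The service and the fallback value below are hypothetical, not taken from the article.

```python
import time

def call_with_fallback(primary, fallback, retries=2, delay=0.1):
    """Invoke an external service call with bounded retries, then
    fall back to a locally controlled alternative."""
    for attempt in range(retries + 1):
        try:
            return primary()
        except Exception:
            if attempt < retries:
                time.sleep(delay)  # brief back-off before retrying
    return fallback()

# Hypothetical example: an exchange-rate service that is down.
def flaky_rate_service():
    raise TimeoutError("third party service unavailable")

rate = call_with_fallback(flaky_rate_service, lambda: 1.0)
```

In a real system the fallback might return a cached value or a conservative default; the key design point is that the calling system, not the external provider, decides what happens when the service misbehaves.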
Total cost of risks versus total cost of quality
Date: 2010-12-29
Risk management and quality are two sides of the same coin. Building quality into organisational decision-making processes and systems is only possible if operational risks are well understood. The results of risk analysis should be a key input for the design of enterprise architectures and systems. It all sounds obvious, but risks associated with the decision-making processes in an…
A data exchange mechanism is not interoperability
Date: 2010-11-24
Software products are marketed with long feature lists, and data export/import features in industry standard formats are commonly advertised – and perceived as the pinnacle of product maturity. Similarly, application integration is often equated with the need for data exchange mechanisms between systems. Yet interoperability is a much wider topic, and data exchange only represents the…
Data quality matters
Date: 2010-10-26
Operational data is the heart of a business in the information age. Without operational data the organisation would cease to function, irrespective of the software and hardware infrastructure that is in place. Hence the quality of data is a logical starting point for identifying opportunities to improve business operations. When used in combination with top-down value chain analysis, a…
Untangling knots of knowledge with semantic modelling
Date: 2010-09-23
Neither written languages nor formal programming languages are capable of representing organisational knowledge in a human-friendly format. Even though Semantic Web technologies attempt to offer assistance in this area, their scope of applicability is limited to the role of establishing crude links between elements of knowledge in the public domain. Making organisational knowledge tangible…
Last word: 2010, the year of the tablet
Date: 2010-09-21
It all really started with the hype around the launch of Apple’s iPad earlier this year. Until then, tablet devices were perceived as a fringe phenomenon, of little interest to the mainstream consumer or business user.
Intelligent management of IT investments and related risks
Date: 2010-08-25
The perceived relevance of information technology varies greatly, depending on who is being asked. In software intensive industries, the IT budget consumes between 10% and 20% of overall operating expenses. In software product development organisations the number is close to 100%. When defining the business strategy, it is easy to focus on costs and to underestimate lost…
Orientation and navigation in enterprise data warehouses
Date: 2010-07-25
Large-scale Enterprise Data Warehouse implementations and operations often lead to multi-million dollar items in annual IT budgets. It is paramount that investments of this magnitude are put to good use, and are translated into tangible value for the organisation. Complexity of the underlying information structures can become a major issue, especially once complexity impacts the ability to…
Getting value out of value chain analysis
Date: 2010-06-26
Value chain analysis is one of the fastest ways to understand the essence of a business or an organisation, provided appropriate techniques are used in the analysis. The only concepts needed for recording value chains are roles, systems, artefacts, the links between these concepts, and a distinction between artefacts that are exchanged with other organisations and artefacts that are only …
Software engineering method and theory
Date: 2010-05-26
Last year Richard Soley, Ivar Jacobson, and Bertrand Meyer called for action to re-found software engineering on principles and practices that are backed by robust scientific theories. Achieving big gains in software quality and productivity by introducing off-the-shelf methodologies has proved to be elusive. The evidence suggests that looking for much smaller (and scientifically…
Successful projects require better collaboration
Date: 2010-04-26
Knowledge workers – and people in general – commonly overestimate their ability to convey information in documents, diagrams, and in discussions. To make matters worse, they typically have too much faith in the validity of their personal mental models to frame the problems that need to be solved. As a result, misinterpretations often remain undetected for months, milestones are …
If you want results – roll your own product selection criteria
Date: 2010-03-24
When it comes to evaluating software products to address a particular business need, the first activity after determining a list of candidate products often consists of sourcing product selection criteria from independent subject matter experts. But qualified product selection is only possible if extensive information about the specific organisational context is taken into consideration,…
Uncertainty makes for a far better business case than hope
Date: 2010-02-22
It is tempting to seek out easy solutions for hard problems. Many others must have had similar problems, and a large part of the solution development effort can be short-circuited by selecting an appropriate productised solution – that’s hope. But similarities between problems in different organisations are easily over-estimated – that’s uncertainty. Business cases are …
Last word: LISP, and 50 years later, the rise of functional languages
Date: 2010-02-19
Software development was still a very esoteric discipline in the days when Lisp was born. In the meantime the software industry went through a whole series of major paradigm shifts.
The un-management of knowledge
Date: 2010-01-25
A decade ago Knowledge Management was the next big thing, and according to the analysts responsible for the Knowledge Management hype, it has evolved into a well-understood concept that is firmly established in the majority of organisations. Nothing could be further from the truth. Only very few organisations have a practically useful definition of knowledge, and even fewer realise that…
Recording business knowhow in a portable format
Date: 2009-12-29
Today business knowhow is mainly stored in two places: in human brains and in software systems. Both forms of storage share the problem that raw knowhow is not easily transferable from one context to another. Valuable knowledge is repeatedly lost through staff turnover and through technology replacements. Minimising knowledge loss requires determination and an understanding of the…
Selecting an open source enterprise service bus
Date: 2009-11-30
When designing a service-oriented architecture it is essential to provide a mechanism for connecting services from different sources. Enterprise Service Bus (ESB) technologies add value when the systems involved don’t make use of shared data formats and communication protocols. The market now includes a number of mature open source ESB technologies. Selecting the most appropriate…
Enterprise collaboration, artefacts, and value chains
Date: 2009-10-30
Given the hype around the interactive aspects of Web 2.0 and the continuing popularity of Business Process X – with X being any element of the set {Management, Modelling, Analysis, Re-engineering, Integration} – the role of artefacts in enterprise collaboration and in value chains is easily neglected. If an organisation looks beyond the hype and invests in a comprehensive and accurate…
Usability and consistency in user interfaces without the pain
Date: 2009-09-29
User interface design, implementation, and validation can easily turn out to be the most expensive part of application development, sometimes consuming over 50% of the overall project budget. This does not have to be the case. If user interface and usability requirements are specified at the appropriate level of abstraction, the required design and implementation effort can be reduced by…
Building a high-performance team
Date: 2009-08-29
Organisational structures that have grown historically, together with simplistic job descriptions, sometimes stand in the way of creating a high-performance team. Taking personality attributes into account when assigning roles and responsibilities can have a measurable influence on overall costs, delivery time, and the functional fit of IT solutions, as well as on skill development in the team.
Replacing complexity with simplicity – one step at a time
Date: 2009-07-29
Organisations are drowning in complexity and information overload. At the same time, saving costs is at the top of the agenda. The only realistic path forward lies in tackling complexity head-on by deploying analytical techniques that help identify spurious complexity and confirm intrinsic complexity. Subsequently spurious complexity can be removed by surgical intervention, one step at a …
Last Word: Lies, damn statistics, and Facebook
Date: 2009-07-28
Recently Wired magazine featured an interview with Facebook CEO Mark Zuckerberg, in which he claims that Facebook does not regard other online networking platforms as competition, but that Google is the real competitor.
Deja vu: Software as a Service
Date: 2009-06-29
Software as a service seems suspiciously familiar, bringing up old memories of time-share mainframe computing in a different era, and more recent memories of application service provider based software offerings. Repackaging old concepts in new terminology is a technique commonly used by software vendors. However, don’t dismiss software as a service due to a lack of technical…
The pseudo risks of web services powered by cloud vapour
Date: 2009-05-31
Proprietary web services are raising concerns about strong lock-in. Those raising the alarm bells paint a simplistic picture based on the assumption that services such as Facebook are representative of the web service landscape. Upon closer examination it appears that the doomsday prophets have a vested interest in prolonging the use of localised IT infrastructure.
Operating systems give way to web services
Date: 2009-04-29
Building valuable software solutions increasingly means building solutions that run on the web, and that are not dependent on any particular operating system. Pervasive web connectivity leads to a new paradigm for building software architectures that is based around the availability of high quality web services and around the conscious use of Open Source software in selected areas to reduce…
Low-fat ESB has significant benefits for SOA health
Date: 2009-03-28
Four years of Service Oriented Architecture hype and a middleware product diet rich in enterprise service buses are starting to take their toll. The drive towards service based application integration often goes hand in hand with unrealistic expectations and simplistic implementations. Instead of a reduction in complexity, the net effect is a shift of complexity from one implementation…
Turbocharged requirements engineering
Date: 2009-02-28
One of the weakest process elements in the software development lifecycle of most organisations is the discipline of requirements engineering. Over-investing in requirements specification amounts to speculation on behalf of the customer, and under-investing in requirements specification leads to speculation by the software development team. The optimal balance involves selecting an …
Best practices for model management
Date: 2009-02-01
One commonly used approach to model management in Unified Modeling Language (UML) tools centres on package-based modularisation and versioning of models – but this leads to a complex and unbounded web of inter-module dependencies. Another approach consists of using a scalable multi-user repository, with versioning at the level of individual atomic model elements. The latter…
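The element-level alternative can be sketched in a few lines: every atomic model element has a stable id, and each change appends a new immutable revision for that one element, rather than versioning a whole package. The repository API and element ids below are illustrative assumptions, not any particular tool's interface.

```python
class ModelRepository:
    """Minimal sketch of versioning at the level of
    individual atomic model elements."""

    def __init__(self):
        self._versions = {}  # element id -> list of revisions

    def commit(self, element_id, payload):
        history = self._versions.setdefault(element_id, [])
        history.append(payload)
        return len(history)  # version number of the new revision

    def head(self, element_id):
        return self._versions[element_id][-1]

    def at(self, element_id, version):
        return self._versions[element_id][version - 1]

repo = ModelRepository()
repo.commit("Customer.name", {"type": "String"})
repo.commit("Customer.name", {"type": "Text", "maxLength": 200})
```

Because each element carries its own history, two modellers can evolve different elements concurrently without contending for a shared package-level version, which is the scalability property the article alludes to.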
The DIY guide to software power tools
Date: 2008-12-31
It is now increasingly recognised that small (domain specific) modelling languages hold the key for improving productivity and quality in software design, development, configuration, interoperability, and operation. Little custom-built languages can be introduced and exploited without necessitating any changes in architectural frameworks or run-time technologies – a characteristic that…
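As a rough illustration of such a "little language" that needs no new run-time technology, the sketch below embeds a tiny rule-definition vocabulary in the host language itself; the order-validation domain and the rule names are invented for the example.

```python
RULES = {}

def rule(name):
    """Register a named business rule; the decorator plus the
    registry form a tiny validation language on top of Python."""
    def wrap(fn):
        RULES[name] = fn
        return fn
    return wrap

@rule("minimum order value")
def min_value(order):
    return order["total"] >= 10

@rule("known currency")
def known_currency(order):
    return order["currency"] in {"NZD", "AUD", "EUR"}

def violations(order):
    # Interpret the little language: report every rule that fails.
    return [name for name, check in RULES.items() if not check(order)]
```

Domain experts read and extend the `@rule(...)` declarations without touching the interpreter, which is the productivity characteristic the article attributes to small domain specific languages.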
Sharpening the focus of IT budgets
Date: 2008-11-29
During times of tight corporate budgets the IT budget is often cut. Planned IT projects are deferred and in some cases selected running projects are cancelled. Unless a systematic and economically sound approach for allocating IT budgets is used, the result can easily backfire, leading to increased operational costs and unusable half-finished applications. Yet, if the right steps are…
The distorted world of information ownership
Date: 2008-10-31
The balance of information power is skewed in favour of knowledge intensive organisations, to the detriment of information-poor organisations and individuals. Reliable, high quality information distilled from Software as a Service users is evolving into a powerful currency that can be translated into financial profit via the sale of ad space and other techniques.
Industry Standards for Modelling Tools: Interoperability is the Horse and not the Cart
Date: 2008-09-28
Business process and software modelling tools provide a good example of a domain with an impressive number of industry standards, many of which are of questionable value. Although software modelling is an extremely valuable activity, and many of the available tools are of high quality, there are significant shortcomings in terms of practical interoperability.
Last Word: What instrumentation is needed to safely operate a financial airliner?
Date: 2008-09-27
Last month’s issue of the Communications of the Association for Computing Machinery (ACM) contained a timely article on the role of formal methods in the design and construction of software systems. The article drives home the point that much of software development today still amounts to “radical design” when viewed from the perspective of established engineering disciplines and that, to…
Building a high performance IT organisation
Date: 2008-08-28
It is all good and well to talk about alignment between business and IT, but it is easy to get trapped – either in purely theoretical business process models that bear little resemblance to reality, or in technical jargon associated with the latest and greatest implementation technologies. Given appropriate executive backing, significant productivity and quality gains can be achieved within…
Banking software headed for extinction
Date: 2008-07-28
In the current credit and liquidity market, investors demand more transparency and accurate, timely product and market information, yet most legacy banking systems are not up to the job. There is a strong business case for replacing legacy banking systems to restore organisational agility, and to improve the quality of service offered to customers.
Activating human intelligence on the Web
Date: 2008-06-28
The usefulness of Web based applications is not limited to the provision of Web-enabled front-ends to traditional business software. The Web also allows the design of applications that are capable of putting powerful human intelligence at our fingertips. Tapping into that intelligence to solve truly hard problems possibly constitutes the next disruptive innovation. Intelligence has never been…
The Industrialised Web Economy – Part 3: Automation and Model Driven Knowledge Engineering
Date: 2008-05-28
Manually re-implementing application functionality in a new technology every few years is highly uneconomical. Model driven automation offers the potential to eliminate software design degradation and to minimise the cost of technology churn. Yet the model driven approach only works if conceptual models of questionable quality are discarded, and if deep knowledge about the business is used to…
The Industrialised Web Economy- Part 2: Software Supply Chains
Date: 2008-04-28
There is a clear trend towards specialisation amongst software vendors, not limited to vertical markets, but also in terms of a concentration on specific areas in the technology landscape. As a result, many software products are becoming more focused and robust, and the opportunities for implementing modular enterprise architectures are increasing. This article is the second…
The Industrialised Web Economy – Part 1: Cloud Computing
Date: 2008-03-31
For most corporate IT departments, concepts such as Cloud Computing seem light years away from current day-to-day reality. Yet the number of commercial providers of such services is growing fast, and even more far-fetched ideas such as global software service supply chains are emerging on the horizon. The distance between innovators and late adopters of modern techniques and technologies…
Service Oriented Anarchy or Architecture?
Date: 2008-02-28
One of the more common mistakes that organisations make in implementing Service Oriented Architecture (SOA) is assuming that introducing the concept of services into the architecture and conforming to SOA-related technical industry standards amounts to a sufficient condition for the development of a maintainable software architecture. Getting software design right additionally requires a…
OpenID – The Latest Remedy for Authentication Headaches
Date: 2008-01-28
Authentication is arguably one of the biggest stumbling blocks on the road towards massive use of Software as a Service and Cloud Computing. Enabling authentication via the traditional login dialogue between individual systems and users does not scale anymore, and home-grown single sign-on architectures are largely limited to the corporate boundary. OpenID addresses the issue of…
Catch Requirements Issues and Defects before a Leaky Boat hits the Water
Date: 2007-12-28
It is well known that the cost of rectifying a defect increases significantly the later it is discovered in the systems development life cycle. At the same time it is well known that software requirements can only be reliably uncovered when an iterative process of validating software under construction is used. Taking full advantage of iterative requirements validation while…
How deep are your SaaS pockets?
Date: 2007-11-28
Implementing a web service oriented architecture leads to more maintainable application systems that are cheaper to operate – if you can afford to wait three years or longer, without resorting to cutting corners, or even pulling the plug. Reduction of risk exposure is the real and immediate reason why consumption and creation of services should be an essential part of renovating and …
The staying power of legacy systems
Date: 2007-10-28
There is never a good time to break the legacy cycle. A significant number of the core systems used in large corporations today have a history that extends over two or three decades. New applications, implemented in modern technologies, often still require additional functionality to be added to legacy back-end systems. But new is not necessarily better, and an educational deficit in the…
Last Word: Software Evolution vs. Software By Design
Date: 2007-09-28
A recent discussion of software development methodologies with a colleague ended in the joint conclusion that the way software is developed today apparently has a lot to do with process elements that are best described as “rituals”. Often these rituals work as expected, but sometimes they don’t.
Raising the Level of Abstraction – Is anyone still interested in traditional Source Code?
Date: 2007-09-28
In the last seven years Domain Specific Modeling and Model-Driven Software Development have emerged as fundamentally new paradigms for software development. Upon closer examination however, there is a familiar pattern at work. The new approaches represent a shift to a higher level of abstraction, not unlike the shift from assembly language to higher-level languages thirty years ago.
Domain Specific Modelling – Time to Raise the Bar for Framework Developers
Date: 2007-08-28
In the last seven years Domain Specific Modeling (DSM) and Model-Driven Software Development (MDSD/MDD) have emerged as fundamentally new paradigms for software development. Upon closer examination however, there is a familiar pattern at work. The new approaches represent a shift to a higher level of abstraction, not unlike the shift from assembly language to higher-level languages thirty years…
New times ahead for software product vendors
Date: 2007-07-28
Quasi-pervasive web connectivity in combination with more sophisticated software services that cope gracefully with short-term loss of connection are changing the landscape in which software product vendors operate. The shift in brand-awareness and power in recent years from traditional IT giants Microsoft, IBM, and Oracle towards web-based brands such as Google is one of several…
The Value of Simplicity
Date: 2007-06-28
We live in the age of personalised and mass customisable products, and this has significant implications for the software systems that enable such products or services. If configurability is added to software as an afterthought, the results are not pretty. In contrast, products or services that are personalised and configured based on intelligent interpretation of user feedback…
Global warning, databases are shrinking faster than expected
Date: 2007-05-28
The amount of information that software intensive businesses store in their databases continues to increase from year to year, fuelled by demands for regulatory compliance (for example SOX), by the increasing complexity of products, and by the quest for a deeper understanding of customer behaviour. Yet, in the next few years, it is likely that the increasing use of web services will lead to…
Collaboration 2.0, networks 2.0, and communities 2.0
Date: 2007-04-28
Web 2.0 ideas and technologies are still evolving rapidly, but it is possible to identify likely dimensions along which further innovation can be expected. The most mature aspect of Web 2.0 arguably consists of simple/elegant web based community tools. Investing in this area is worthwhile, but the effort should best be channelled into the one or two most relevant platforms.
The x-factor in IT service efficiency
Date: 2007-03-28
The speed, quality, and cost with which IT solutions are built and IT services are delivered depend on a large number of variables. Understanding and managing these variables can lead to order of magnitude improvements – neglecting them can lead to serious inefficiencies.
What kind of software development process do you need?
Date: 2007-02-28
Even when one has settled on implementing an iterative software development process, there is still a large number of approaches and process frameworks to choose from. Instead of another instance of a “method war”, it is much more productive to concentrate on techniques from different methods, which can be strung together into a working approach that is suitable for a specific context. …
Non-technological aspects of scalability
Date: 2007-01-28
All too often scalability considerations are limited to a technical discussion of implementation technology combinations, and other aspects of scalability are ignored. Organisational scalability is only achievable if not only software architecture, but also knowledge management and software portfolio management are part of the equation.
Last Word: Is Microsoft’s Dominance of the Desktop Operating System coming to an end?
Date: 2006-12-28
Have Microsoft Operating Systems reached their best-used-by date? Ten years ago such a question would have seemed ridiculous. Today however, there are several indications that the Microsoft rule in the OS domain should no longer be considered as one of the fundamental constants of IT.
Change management: maximising the benefits of change and minimising the pain
Date: 2006-12-28
Most organisations are fairly adept at dealing with routine changes that have minimal local impact on processes and systems. The topic of change management becomes an order of magnitude more challenging when the changes in question amount to a fundamental shift in the business model or in the way in which the business model is implemented: Form needs to follow function, new approaches need…
When can you afford the SOA learning curve?
Date: 2006-11-28
The term Service Oriented Architecture (SOA) is used to refer to a whole variety of approaches to achieving enterprise software integration and/or some degree of reuse. By now there is reasonable consensus in the industry around the essence of service orientation, yet no Web Service standard can ever prevent implementers from making glaring mistakes in their use of the SOA concept. The number of SOA…
Sending Waterfall Software Development into Retirement
Date: 2006-10-28
Much has been written about the benefits of iterative, incremental software development. There is virtually no software development or integration project that could not benefit from an iterative approach. Yet many large, high-risk software projects are still managed according to the “good old” waterfall approach. And in those cases where projects are run in accordance with some iterative…
Effective Measurement of Software Development Productivity
Date: 2006-09-28
Estimating the cost of software development projects is notoriously difficult. The simple “thumb suck” technique still enjoys significant popularity, and although attempts to introduce a more rigorous estimation process usually lend a scientific touch to the process, any numbers that are not based on historic metrics tend to collapse like a house of cards. Obtaining useful metrics is…
Software Architecture Quality Management
Date: 2006-08-28
The quality of the software architecture in a system is not easily measurable without the help of specialised tools and without a meaningful baseline or benchmark. The short life expectancy of most software systems is often explained as being due to rapid technological change. For the most part this explanation is simply a convenient excuse for tagging existing…
Enterprise Software as a Service?
Date: 2006-07-28
Compared to the consumer market, the enterprise market is more conservative when letting an external service provider store and manage its critical business information remotely, via the web. But in the face of spiralling internal IT operational costs, many companies are likely to significantly expand their use of Software as a Service (SaaS), previously known as Application Service…
Different Shades of Grey of Vendor Lock-In
Date: 2006-06-28
The potentially negative impact of vendor lock-in is unavoidable, but it can be minimised by making intelligent choices with respect to the use of technology products when building application software. In the interest of keeping the cost of lock-in at bay, IT organisations should rate the maturity of the various technologies that are being employed, consider the results in the design of…
IT Standards – Managing Commodity Products and Services
Date: 2006-05-28
The increase in IT related standards since the invention of the Web in 1989 can be seen as an indication of maturity of the IT industry. Today, all kinds of devices that contain software provide interfaces that allow them to communicate with other devices. Similarly, in the realm of enterprise software, today’s applications are typically interconnected across organisational boundaries…
Transitioning to Model Driven Software Development
Date: 2006-04-28
It is time for a major stock-take of model driven software development approaches within software intensive industries. Progress in the last few years in terms of developing interoperability standards for model driven tooling has not been spectacular. The term “Model Driven Architecture” has gone through the usual hype cycle, and the dust is in the process of settling. Model Driven…
Scaling up Agile Software Project Management
Date: 2006-03-28
Over the last five years agile software development approaches have become more popular, and are increasingly replacing heavy-handed methodologies. At the same time there is a growing interest in benchmarking the productivity of software projects, and in achieving process maturity that can be measured against certification standards such as CMMI. At first sight it would seem that these…
Open Source Software Development Tooling
Date: 2006-02-28
Open Source Software Development Tools are becoming mainstream. In the Java space, the number of available tools is mindboggling, and keeping up with the latest developments is becoming more and more a matter of being well-connected to the Open Source community and receiving tips and suggestions from trusted colleagues about the best and latest tools. It is no longer true that it is sufficient…
Understanding Model Driven Approaches to Software Development
Date: 2006-01-28
Within the software engineering community only a few people fully understand the difference between the traditional use of models in software engineering and newer, so-called “model-driven” approaches. In particular, the discipline of Enterprise Architecture makes extensive use of modelling techniques, and mainstream practice has not yet caught up with the model-driven approaches that are …
The Next Generation CRM Paradigm
Date: 2005-11-28
The rising popularity of online business networking platforms, sometimes also referred to as social software, is the first sign that the traditional CRM paradigm that equates to “one CRM system instance for each organisation” has reached the limits of its usefulness. The players that shape a new, complementary CRM paradigm exploit pervasive use of broadband and wireless internet…
Maximising the Mileage of Software Development
Date: 2005-10-28
Practical experience shows that software development initiatives usually entail high risks for the customer and the software developer. In anticipation of these risks, both parties attempt to mitigate the impact: the customer often insists on a fixed price, and the software developer consequently builds contingency into that fixed price. This simplistic mitigation strategy rarely works. Successful…
How to approach Service Oriented Architecture (SOA)
Date: 2005-09-28
SOA is an increasingly common TLA (three letter acronym), and is often thought of as a new technology – and equated with Web Services. This does an injustice to Service Oriented Architecture, a new software design concept that emerged from the need to easily integrate web-based applications independent of their implementation technology. Hence the adoption of SOA is not about migrating to yet…
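The distinction between SOA as a design concept and any particular technology can be illustrated with a minimal sketch: consumers depend only on a service contract, and implementations behind that contract are interchangeable. All names here (`PaymentService`, `checkout`, and so on) are hypothetical and not taken from the article.

```python
from abc import ABC, abstractmethod

class PaymentService(ABC):
    """Hypothetical service contract: callers depend only on this
    interface, never on a particular implementation technology."""

    @abstractmethod
    def authorise(self, account: str, amount: float) -> bool: ...

class InMemoryPaymentService(PaymentService):
    """One possible implementation; another might wrap a remote Web
    Service endpoint. Consumer code is identical either way."""

    def __init__(self, balances: dict) -> None:
        self.balances = balances

    def authorise(self, account: str, amount: float) -> bool:
        return self.balances.get(account, 0.0) >= amount

def checkout(service: PaymentService, account: str, amount: float) -> str:
    # Written purely against the contract, so swapping the backing
    # technology requires no change here.
    return "approved" if service.authorise(account, amount) else "declined"
```

In this framing, adopting SOA means defining such contracts at organisational boundaries; whether the transport is Web Services or something else is a secondary decision.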
Managing Complexity in Application Software
Date: 2005-08-28
It is easy for software development teams to be preoccupied with, and to get lost in, low-level design. The simplest preventative measure to curb spurious complexity, without being prescriptive at the micro-level, is to consistently make use of a nested subsystem structure within the system architecture. The result is an architecture with fewer point-to-point interfaces. This strategy …
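The arithmetic behind "fewer point-to-point interfaces" can be sketched as follows; the counting model is an illustrative assumption (worst-case full interconnection, and one facade per subsystem), not a formula from the article.

```python
def point_to_point_interfaces(n_modules: int) -> int:
    # Worst case: every module talks directly to every other module,
    # giving n * (n - 1) / 2 potential pairwise interfaces.
    return n_modules * (n_modules - 1) // 2

def nested_interfaces(subsystem_sizes: list) -> int:
    # With a nested subsystem structure, modules inside a subsystem may
    # still interconnect, but across subsystems only one facade per
    # subsystem is exposed.
    internal = sum(point_to_point_interfaces(size) for size in subsystem_sizes)
    across = point_to_point_interfaces(len(subsystem_sizes))
    return internal + across
```

Under these assumptions, 12 modules wired point-to-point yield up to 66 interfaces, whereas the same 12 modules grouped into three subsystems of four yield at most 21, which is the kind of reduction the nested structure is meant to deliver.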
The role of Open Source software in building durable Enterprise Architectures
Date: 2005-07-28
When it comes to design and implementation of an Enterprise Architecture, traditionally the key decisions regarding software systems have been around building vs. buying, and vendor selections based on criteria derived from business requirements. In the last five years however, many Open Source infrastructure software offerings have matured to the point of being rated best-in-class…