While this second generation of intelligent applications (based on “cognitive analytics”) appears to be gradually becoming mainstream, harnessing the first and second generations of intelligent applications to implement end-to-end solutions to business problems (a.k.a., “business solutions”) remains challenging today, as the community is still mining its collective knowledge to discover the underlying principles for designing neural network architectures and their more traditional machine learning counterparts.
Although a variety of known techniques for implementing business solutions have been explored in specific domains and / or contexts, there does not appear to be a
consensus on combining them to power an
ecosystem and
knowledge management framework that facilitates the creation and management of intelligent active and / or autonomous business solutions across multiple different business contexts.
While the
ecosystem and framework described herein seeks to realize that vision, there are challenges to harnessing knowledge at many levels.
However, while the ANN could identify these furry animals, it was not able to connect the cat pattern to the word “cat”, so that part of the knowledge had to be provided to the ANN; this is the “taxonomy” linkage.
Moreover, a machine would not be able to recognize associated tasks by name or know when to apply them out of context.
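As a minimal illustration of this missing “taxonomy” linkage (assuming a hypothetical classifier, label map, and task registry that are not part of any specific ANN implementation), the following sketch shows that the association between a learned pattern and the word “cat”, and between that word and its associated tasks, must be supplied from outside the network.

```python
# Hypothetical sketch: an ANN can output a class index for a "furry animal"
# pattern, but linking that index to the word "cat" (and to related tasks)
# requires an externally supplied taxonomy.

def classify(image_features):
    """Stand-in for a trained ANN; returns only an opaque class index."""
    return 3  # e.g., the network's internal class for the learned pattern

# The "taxonomy" linkage: mappings the network itself does not contain.
CLASS_LABELS = {3: "cat"}                      # index -> word
ASSOCIATED_TASKS = {"cat": ["feed", "groom"]}  # word -> tasks it implies

index = classify(image_features=[0.12, 0.87, 0.44])
label = CLASS_LABELS.get(index, "unknown")
tasks = ASSOCIATED_TASKS.get(label, [])
print(label, tasks)  # the ANN alone could not have produced this linkage
```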
Various studies in many business domains have shown that employees who leave institutions after decades of service are not easily replaceable and the extent of knowledge loss for institutions is tremendous.
Beyond institutions, we all face the same situation when we lose close ones and miss the ability to leverage their knowledge and collaborate with them in our day-to-day lives.
Second, ecosystems evolve when elements are added or removed, like food webs or food cycles.
The premise of such a knowledge-centered
ecosystem is to allow the proactive evolution of digital ecosystems by considering the fact that, in addition to more traditional reasons, elements may not function well or are missing entirely because they do not leverage the latest knowledge or related capabilities.
Without them, the touchpoints that are within Company X's control cannot evolve, respond to the ongoing needs of their external audiences and business solutions, and nurture engagement and community as necessary.
While Company X sets up these digital presences and publishes to them, these touchpoints may not be completely within Company X's control, since the
user experience design is fixed by the platform being used, and because user-generated content is a significant part of the experience.
Intelligent active and / or autonomous business solutions are designed to log traditional malfunctions (e.g., errors or warnings) as well as issues with solutions that may not function well or are missing entirely because they do not leverage the latest knowledge or related capabilities.
While many of these technology assets are outside the control of Company X when using the DKME, some of them are not.
While RDF and triples are a useful way to formally specify entities and their properties and show the relationships between concepts, they lack the semantic expressive power required to precisely describe and match business problem and solution semantics at a level that can be used to drive decisions and power intelligent active and / or autonomous business solutions that display human-like abilities within that realm.
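To make this limitation concrete, the following minimal sketch (illustrative only, not Archemy™ code) represents RDF-style triples as plain tuples: the triples can assert that a solution addresses a problem, but a matcher that only follows such links has no graded semantics with which to decide how precisely each solution fits the problem.

```python
# Minimal, hypothetical sketch: RDF-style triples as (subject, predicate, object).
# They can assert relationships, but carry no graded semantics for matching.
triples = [
    ("SolutionA", "addresses", "InvoiceReconciliation"),
    ("SolutionA", "implementedIn", "Java"),
    ("SolutionB", "addresses", "InvoiceReconciliation"),
]

def solutions_for(problem):
    """Follow 'addresses' links only; every match looks equally good."""
    return [s for (s, p, o) in triples if p == "addresses" and o == problem]

print(solutions_for("InvoiceReconciliation"))
# ['SolutionA', 'SolutionB'] -- the triples alone cannot rank which solution
# fits the business problem more precisely; richer taxonomy criteria are needed.
```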
Solutions suitability and availability gaps generally result from insufficient knowledge or understanding of the solutions and / or solution components that are currently in use or possibly the lack of such solutions altogether.
However, while the Archemy™ business problems and solutions cataloguing process is designed to minimize gaps, intelligent active and / or autonomous business solutions may still need to realign their behavior on an ongoing basis to address a perceived history of their inabilities to fulfill some end users' needs.
Until
semantics are captured for problems and solutions using the most descriptive taxonomy criteria, mismatches between problems and solutions may occur.
New or existing taxonomy criteria values may need to be optimized as taxonomies evolve, again possibly increasing the extent of mismatches.
Intelligent active and / or autonomous business solutions log traditional malfunctions (e.g., errors or warnings) as well as issues with solutions availability and suitability.
As a result,
metrics measure the perceived history of gaps in business problems or business solutions knowledge based on the current taxonomy, the inability to respond to a need and / or deliver satisfactory solutions, and the lack of precision and accuracy of solutions' matches.
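A minimal sketch of how such metrics might be tallied from a log of matching attempts is shown below; the record fields and the suitability threshold are illustrative assumptions rather than DKMF specifics.

```python
# Hypothetical sketch of gap metrics over a log of matching attempts.
# Each record notes whether a solution was found and how well it matched.
match_log = [
    {"request": "r1", "solution_found": True,  "match_score": 0.92},
    {"request": "r2", "solution_found": True,  "match_score": 0.55},
    {"request": "r3", "solution_found": False, "match_score": 0.0},
]

SUITABILITY_THRESHOLD = 0.8  # assumed cut-off for a "suitable" match

# Availability gap: requests for which no solution was found at all.
availability_gap = sum(1 for r in match_log if not r["solution_found"])

# Suitability gap: solutions found but matching below the assumed threshold.
suitability_gap = sum(
    1 for r in match_log
    if r["solution_found"] and r["match_score"] < SUITABILITY_THRESHOLD
)

# Average match precision over the solutions that were found.
found = [r["match_score"] for r in match_log if r["solution_found"]]
avg_match_precision = sum(found) / max(1, len(found))

print(availability_gap, suitability_gap, avg_match_precision)
```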
However, nothing keeps DKMF-based business solutions architects from building P2P cluster nodes that have the same set of layers as the ones used by P2P nodes, as illustrated in FIG. 14.
However, nothing keeps DKMF-based business solutions architects from building P2P cluster nodes that have the same set of layers as the ones used by P2P nodes, as illustrated in FIG. 20.
Second, intelligent business solutions cannot simply rely on built-in knowledge that limits their ability to perform a few predetermined tasks.
While the architectural design approach described in the above leads the way to pattern-based design, patterns themselves may not always be sufficient to develop a complete design.
As we delved into the architectural design of a
software system that would enable business solutions to learn how to improve their performance of particular tasks on their own, it quickly became clear that the system could not be designed outright by just applying patterns.
Traditional industries are being disrupted and pushed to re-strategize.
This model-centric approach was prohibitively time-intensive to keep up-to-date and left too much room for error as changes to the architecture occurred unchecked and isolated in the heads of small groups of architecture specialists.
Indeed, as complexity increases, the ingenuity needed to master it must continuously improve.
In the past, these tasks were performed by skilled Enterprise Architects on an
annuity basis and the resulting work products quickly became obsolete.
However, few studies have focused on
tacit knowledge, which is known to be complex in nature and difficult to acquire and extract.
Organizations may have different views as to how knowledge should be managed within the Enterprise, but it does not change the fact that
tacit knowledge acquisition is difficult.
The complexity of
tacit knowledge results mostly from the fact that it is personal, not easy to articulate, contextualized, job specific, both known and unknown to the holder, and therefore difficult to verbalize and share with others.
However, it is generally possible to share tacit knowledge via conversations or narrative, and it is sometimes possible to transform it into
explicit knowledge.
Finding ways to extract tacit knowledge is the main difficulty since a large amount of it is specific and resides in the heads of experts who are not available at all times. Deeply embedded in these experts' minds are insights, intuitions, hunches, inherent talents, skills, experience, know-who, know-why, and working experiences.
This does not make it easy to extract justifications from experts about their decisions, methods, or sequence of steps in performing a specific task.
Research in this area has demonstrated that
ranking irregularities may occur when using such methods when certain manipulations on the structure of a simple MCDA problem are performed.
As a result, different MCDA methods may yield different answers for exactly the same problem and there is, in general, no exact way to know which method gives the right answer.
The typical problem in MCDA is concerned with the task of
ranking a finite set of decision alternatives, each of which is explicitly described in terms of different characteristics (also often called attributes, decision criteria, or objectives) which must be considered simultaneously.
This method, however, still suffers from ranking problems that stem from the normalization and the use of an additive formula on the data of the decision matrix for deriving the final preference values of the alternatives.
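For concreteness, the following sketch implements a plain additive (weighted-sum) scoring of a small decision matrix with column normalization; the alternatives, values, and weights are invented for illustration. Because the final preference values depend on the normalization and the additive formula, adding or removing an alternative changes the normalized columns and can therefore change the ranking of the remaining alternatives.

```python
# Hedged illustration of additive (weighted-sum) MCDA over a decision matrix.
# Alternatives are rows, criteria are columns; values and weights are invented.
alternatives = {
    "A1": [7.0, 0.30, 120.0],
    "A2": [6.5, 0.45, 100.0],
    "A3": [8.0, 0.20, 150.0],
}
weights = [0.5, 0.3, 0.2]  # assumed relative importance of the criteria

def rank(matrix, weights):
    # Normalize each criterion column by its maximum, then apply the additive formula.
    cols = list(zip(*matrix.values()))
    maxima = [max(col) for col in cols]
    scores = {
        name: sum(w * (v / m) for v, w, m in zip(values, weights, maxima))
        for name, values in matrix.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank(alternatives, weights))
# Removing or adding an alternative changes the column maxima, so the
# normalized values -- and possibly the ranking -- of the others can shift.
```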
That is, the stricter the outranking requirements, the more difficult it is for one alternative to outrank another.
ELECTRE algorithms appear reliable and well structured; however, they are also subject to ranking irregularities.
However, given some of the flaws in the various methods used to ensure reliable decision-making, Archemy™ is using various algorithms in its ArchNav™ tool to match business problem descriptive criteria to reusable best practice business solutions criteria and related components.
This number will likely detract from the feasibility of developing a business solution selection decision aid if the
decision maker must assign a preference to each of the criteria.
This difficulty has been examined and it has been previously postulated elsewhere that an increasing number of decision criteria impacts negatively on the reliability of an associated decision result.
First, the various DKMF algorithms allow precise and accurate matching of business problems with solutions, which has been proven to be difficult to achieve with decision-making algorithms.
Contrary to popular belief, customers moving to the Cloud still need to worry about reuse as they are now leveraging another organization's reuse program and lose
direct control over the assessment of the benefits, reliability, and safety of reuse.
We are not facing a lack of reuse in
software engineering, but rather a lack of widespread, systematic reuse.
Unfortunately, fast informal solutions always seem to take precedence in a setting where pragmatic problems are the norm.
While work on reuse intensified in the late 1980s, it did not deliver on its promise to significantly increase productivity and quality.
Without a strict compliance to best practice guidelines, it would not be possible to specify precise criteria when using Archemy™ taxonomies to properly classify
software assets and add them to ArchVault™.
A proactive approach tends to require large upfront investment, and returns on investment can only be seen when products are delivered and operational.
It is typically hard for enterprises to predict their product line requirements well into the future and have the time and resources for a long development cycle.
Unless there is a clear problem boundary and projected requirements variations, it may be difficult to apply this approach.
Furthermore, if there is no sound architectural basis for the products in
a domain, a reactive approach may lead to ongoing costly refactoring of existing products as new assets are developed.
The cost of a centralized unit is typically amortized across projects within the product line.
However, this centralized approach may require a large upfront investment to mobilize an organizational unit dedicated to implementing a reuse program, and experts may have to be pulled away from ongoing projects to populate this new unit.
This centralized organizational approach is not typically successful without a strong commitment from upper management.
However, the lack of a common vision for the reuse program may make it difficult for a given project to develop a component that meets the needs of other projects and there must be a cost / benefit model in place to solicit active bilateral participation from all projects.
On one hand, a behavior
upgrade may result in increasing the node's DKME subscription cost, as well as service or transaction costs.
This limited the use of Web services in complex business contexts, where the
automation of business processes was necessary.
In an automatic
composition system, the user's role is limited to specifying the functional requirements.
Frame problems typically occur when behavioral service specifications, used in composition and
verification frameworks for
Semantic Web Services, employ a
precondition / postcondition notation.
The frame problem results from the difficulty of effectively expressing what does not change, apart from what changes.
Other known potential issues with service specification-based composition and
verification frameworks for
Semantic Web Services are the ramification and qualification problems.
The ramification problem results from possible difficulties in representing knock-on and indirect effects (a.k.a., ramifications).
For example, PayCompleted(doc, user) may lead to
credit card invalidation if a limit is reached.
The qualification problem results from possible difficulties dealing with qualifications that are outside our knowledge and result in inconsistent behavior (i.e., exogenous aspect).
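A minimal precondition / postcondition sketch of a hypothetical payment service, in the spirit of PayCompleted(doc, user), helps locate where these problems arise: the postcondition states what changes but says nothing about what stays the same (the frame problem), omits indirect effects such as a card becoming invalid when a limit is reached (the ramification problem), and cannot anticipate exogenous conditions outside the modeled knowledge (the qualification problem). The notation below is illustrative and is not a specific Semantic Web Services syntax.

```python
# Hypothetical precondition/postcondition specification of a payment service.

def pay(state, doc, user, amount):
    # Precondition: the user has sufficient credit for the amount.
    assert state["credit"][user] >= amount, "precondition violated"

    # Postcondition: the document is marked as paid and the credit is reduced.
    state["paid"].add(doc)
    state["credit"][user] -= amount

    # Frame problem: nothing here states that all other documents and users
    # are unaffected; that must be expressed (or assumed) separately.
    # Ramification problem: an indirect effect, e.g. the card becoming invalid
    # when the credit limit is reached, is not captured by the postcondition.
    # Qualification problem: exogenous conditions (a frozen account, a network
    # outage) lie outside this specification and can lead to inconsistent behavior.
    return state

state = {"credit": {"alice": 100}, "paid": set()}
pay(state, doc="invoice-42", user="alice", amount=60)
print(state)
```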
However, it does not consider the fluent calculus extensions that solve the ramification and qualification problems, resulting in a framework that ignores their effects and fails to capitalize on the benefits of their solutions, such as expressing ramifications or explaining non-conforming behavior.
The DKMF adheres to certain safety and reliability reuse guidelines when deploying updated processes, so updated processes are not always deployed automatically.
While these capabilities are of tremendous value for enabling this type of solution, the context provided by these solutions will lead to further optimizations in automated process generation and composition / integration of business components.
However, in general, the problems associated with representing and reasoning about complex, dynamic, possibly physical environments are still essentially unsolved.
Perception of gaps by intelligent active and / or autonomous business solutions can be triggered by their inability to understand end-users' requests or serve their needs.
It can also result from induced gaps caused by regular updates to taxonomies triggered by human experts and / or
machine learning algorithms in reaction to the addition of new knowledge,
cognitive analysis of performance and feedback data (e.g., taxonomy and solutions availability matches), and feedback gathered by the DKME and the global P2P
constellation.
As noted earlier, the overall premise of our knowledge ecosystem is to allow proactive evolution of digital ecosystems by considering the fact that, in addition to more traditional reasons, elements may not function well or are missing entirely because they do not leverage the latest knowledge or related capabilities.
As these methods aimed at producing increasing
business value, their maturity and complexity also increased as shown in FIG. 56.
However,
big data systems now combine non-traditional data from multiple sources (i.e., inside and outside of the organization), multiple channels (e.g.,
social media, web logs, purchase transactions, customer service interactions), and multiple
viewpoints (e.g., context, content, sentiment, location, time).
Big data analytics has been
processing large datasets for some time now but is only able to generate insights from structured or pre-tagged data.
Therefore, it cannot predict a behavior, pattern, or outcome that has never been seen before.
Cognitive Analytics deals with problems that are inherently uncertain; problems whose solutions are unavoidably and necessarily probabilistic.
Unstructured data creates problems that are inherently uncertain; problems whose solutions are unavoidably probabilistic.
Furthermore, unlike
big data analytics, most cognitive analytics tasks do not lend themselves to the map-reduce style of
distributed computing.
Because the data is completely unpredictable, unstructured, and string-based, the computational complexity of the reduce step tends to be of the same order of magnitude as the complexity of the original problem.
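The point can be illustrated with a toy map-reduce sketch (a simplification, not a statement about any particular big data platform): when the emitted keys are unpredictable free-form strings, nearly every record produces its own key, so the reduce phase handles roughly as many groups as there were input records.

```python
from collections import defaultdict

# Toy map-reduce sketch. With structured data, map emits a small set of keys
# and reduce aggregates cheaply. With unpredictable free-form strings, almost
# every record yields a distinct key, so the reduce step does work on the same
# order of magnitude as the original problem.
records = ["status=ok", "user said it felt slow today", "error 500 on /pay",
           "status=ok", "totally new phrasing never seen before"]

def map_phase(recs):
    for r in recs:
        yield (r.strip().lower(), 1)  # free-form text used directly as the key

def reduce_phase(pairs):
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

result = reduce_phase(map_phase(records))
print(len(records), "records ->", len(result), "reduce groups")
```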
Notwithstanding the challenges, the
sky is the limit with cognitive analytics.
Of course, not all applications should be left to a machine to make the decision, but it is not unreasonable to allow a machine to mine your massive data collections actively and / or autonomously for new, surprising, unexpected, important, and influential discoveries.
It was conceived to
handle the fact that, in addition to more traditional reasons, intelligent active and / or autonomous business solutions components may not always function well or may be missing entirely because they do not leverage the latest knowledge or related capabilities.
Supporting these predictive, prescriptive, and cognitive analytics functions is challenging within a P2P environment as performance and feedback data may have to be processed either locally or globally (or both) depending on the nature of the analytics to be performed (e.g., taxonomy learning via cognitive analytics is typically handled at the global level within the ArchSSV™ cluster) and the amount of data at hand.
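One hedged way to picture this routing decision is the dispatch sketch below; the analytics categories, the size threshold, and the dispatch rules are assumptions made for illustration and do not reflect specific DKMF or ArchSSV™ internals.

```python
# Hypothetical dispatch of analytics work between a local P2P node and the
# global cluster, based on the kind of analytics and the volume of data.
GLOBAL_ONLY = {"taxonomy_learning"}      # e.g., cognitive taxonomy learning
LOCAL_PREFERRED = {"error_diagnostics"}  # assumed example, not a DKMF specific
SIZE_THRESHOLD_MB = 512                  # assumed cut-off for shipping data globally

def dispatch(analytics_kind, data_size_mb):
    if analytics_kind in GLOBAL_ONLY:
        return "global"
    if analytics_kind in LOCAL_PREFERRED and data_size_mb <= SIZE_THRESHOLD_MB:
        return "local"
    # Large or mixed workloads may be processed both locally and globally.
    return "local+global" if data_size_mb > SIZE_THRESHOLD_MB else "local"

print(dispatch("taxonomy_learning", 50))    # -> global
print(dispatch("error_diagnostics", 2048))  # -> local+global
```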
As diagnostic analytics require the ability to provide recommendations, their current support is limited to the existing error handling capabilities within the DKMF.
This is priceless as it applies to the DKME since the lack of solutions availability drives demand for new services and the lack of solutions suitability drives the need for a more precise semantic description of services to optimize suitability feedback (e.g., improved maturity).
Again, this is all priceless in a DKME context, since it is the norm for software factories to ensure a systematic and formal practice of reuse, and since guidelines and procedures for reuse must be defined and made actionable based on the feedback collected to assess reuse performance.
While Kubernetes is quite mature in Linux, it is less mature in Windows.
While DC / OS is quite mature in Linux, it is less mature in Windows.
While Service Fabric is quite mature in Windows, it is less mature in Linux.
Adding in
Internet of Things (IoT) type of data from sensors provides a level of complexity that is both a challenge and an opportunity when it comes to ontology learning.
As a result, we have limited discussions about implementation details to a minimum.
Due to the lack of standardized
knowledge management methods to classify business problems and related business solutions, new business solutions constantly have to be re-developed and sometimes need to coexist with existing business solutions.
This trend has been accelerated over the past few years as new innovative technologies keep disrupting the ability for businesses to compete in their market.
As a result, new business solutions keep proliferating, take time to develop, lack integration with existing solutions, and increase the cost and burden of doing business.
Some of the built-in functionalities provided by the DKMF are greyed out since they are not used by the Archemy™ business solution.
The main symptom is swelling in an arm or leg that may be accompanied by pain or discomfort.
At present, there are no surgical or medical interventions that can cure
lymphedema.
Forging legal contracts today requires legal knowledge and / or the intervention of costly qualified attorneys throughout the process.
Due to the lack of standards and integration methods for
information capture and monitoring devices, it is difficult to streamline the creation of integrated facilities monitoring and management solutions.