Gartner: What Customers Need to Know When Considering a Move to S/4HANA — 2018 Update

Published 8 October 2018 – ID G00363664

From the report:

By 2020, at least 35% of SAP ERP clients will be running one or more functional modules of SAP S/4HANA.
Gartner predicts that there will be a tsunami of S/4HANA adoption between 2021 and 2023, which could drive up the implementation cost as skilled resources are scarce. SAP customers may be at risk if they recruit less-experienced talent, as this could result in unsuccessful implementation programs.

It is critical to understand that, even if your company decides not to adopt S/4HANA immediately, you still have to consider the changes that S/4HANA will bring (including for all current work on your existing ECC landscape) so as not to add further technical debt. For example, continuing to add customizations to your current ECC landscape will further complicate your migration approach and will certainly increase project costs.

All existing SAP Business Suite users need to analyze the impact of S/4HANA on their ERP strategies and SAP investments. Gartner's advice is not to wait: perform this analysis now. Many application leaders are tempted to ignore SAP S/4HANA "until it is mature" because understanding its impact is complex. This is the wrong approach, because any additional investments made in current SAP deployments will be affected by future plans regarding S/4HANA. Application leaders must ensure that the business has agreed a clear strategic direction for its S/4HANA adoption roadmap before making any further investments in its current SAP ERP landscape.

Gartner has defined six categories of potential benefits that S/4HANA can deliver, based on our own analysis and growing evidence from reference customers:

  1. Performance improvements: The IMC (in-memory computing) platform of S/4HANA means that existing application processes should run faster and, in some cases, the performance improvements can be dramatic, with long-running process execution reduced from many hours to minutes, or even seconds.
  2. Real-time analytics: The combination of IMC technology and the hybrid transaction/analytical processing (HTAP) architecture of S/4HANA means that analytics can be performed in real time on transaction data (instead of extracting data to a separate instance of SAP Business Warehouse or other data warehouse platform). Also, the processing power of the IMC platform means that forecasts and simulations can be run in real time against large volumes of transaction data, something that is not possible with traditional relational architectures. The HANA IMDBMS technology includes predictive analytic algorithms (the Predictive Analytics Library), which are being leveraged by the S/4HANA product developers.
  3. Impact of the simplified architecture and associated Fiori applications: The architectural changes in S/4HANA both simplify the data schema and, in some areas, change the way existing functionality is used. For example, the new Universal Journal table in financials removes the need for reconciliation between the various ledgers in SAP financials, and should simplify month-end financial close processes. It also enables profitability analysis at lower levels of granularity through new and customizable derivation rules for profitability characteristics. Each release of S/4HANA delivers more architectural changes (for example, 1511 and 1610 have delivered real-time inventory valuation and accelerated material requirements planning).
    S/4HANA includes Fiori applications that only work with the simplified architecture. These include transaction processing applications, fact sheets (these display KPI tiles and allow further drill-down) and packaged analytics applications. These are mostly role-based and could deliver improvements in how users process transactions and access information. Each release of S/4HANA includes new Fiori applications of all types.
  4. Benefits of new S/4HANA functionality: SAP has already released several new functional capabilities that are unique to S/4HANA; for example, SAP Cash Management, Central Finance (see Note 1) and a version of SAP Business Planning and Consolidation that is optimized for S/4HANA. So far, these new capabilities have been focused on the finance domain, but it is likely that SAP will release new solutions that impact other domains in the future. Any assessment of potential S/4HANA benefits should include the impact of these new capabilities, but they may require additional licenses, so it is important to check licensing requirements with SAP.
  5. IT benefits: There will be a reduction in database size because of the simplified data architecture. There may also be some simplification of the IT landscape (for example, the need for SAP Business Warehouse may be reduced or even eliminated through the use of real-time analytics).
  6. Potential for enabling new ways of doing business: The combination of performance improvements, real-time analytics, the simplified architecture and new functionality being delivered in S/4HANA could enable significant process innovation. For example, running complex “time-bound” processes in minutes or seconds rather than hours, coupled with real-time predictive simulation and forecasting capabilities, means S/4HANA could become a real-time business management system rather than a transaction-processing system based on daily, weekly and monthly cycles. However, this may be challenging because business leaders will have to rethink and change established ways of working to realize this potential.

Good Old SQL

While we were enjoying our beers after the match the other day, a fellow football player asked me if I used SQL.
Sure I use SQL, but until I was asked this question I had never really thought systematically about the reasons for using it. So here is a very short recap of the benefits of SQL, for my fellow footballer, for all those who are just entering that domain … and for me:

  • You can do almost whatever you want with data.
  • There is no transferring of unnecessary data to the application server.
  • You can substantially reduce necessary coding logic in applications.
  • You push the work to a database process instead of a persistence-object tier on the application server, avoiding network latency and all the problems in between.
  • I think that findWhere-like methods are good for beginners, but too simple for advanced users.

As an example, the SQL SELECT statement below fetches all SAP sales orders that have a billing date of today and contain only service item categories.

select vbap.vbeln
from vbak
join vbap on vbap.vbeln = vbak.vbeln
join vbkd on vbkd.vbeln = vbak.vbeln
where vbkd.posnr = '000000'
  and vbap.uepos = '000000'
  and vbkd.fkdat = current_date
group by vbap.vbeln
having sum(case when vbap.pstyv <> 'TAD' then 1 else 0 end) = 0

For those sales orders we call BAPI_BILLINGDOC_CREATEMULTIPLE to automate the invoice creation process.

Fetching all items and examining them on the application level would increase network traffic and would be slower than this approach.
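The HAVING SUM(CASE …) trick (keep only the orders in which no item violates the condition) can be tried outside SAP as well. Here is a minimal sketch using Python and SQLite; the tables loosely mirror VBAK/VBAP/VBKD, but all row values are made up for illustration:

```python
import sqlite3

# In-memory toy database; names mimic the SAP tables, data is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE vbak (vbeln TEXT PRIMARY KEY);                          -- order header
CREATE TABLE vbap (vbeln TEXT, posnr TEXT, uepos TEXT, pstyv TEXT);  -- order items
CREATE TABLE vbkd (vbeln TEXT, posnr TEXT, fkdat TEXT);              -- business data

INSERT INTO vbak VALUES ('1000'), ('2000'), ('3000');
-- Order 1000: all items are service items (TAD) -> should be selected.
INSERT INTO vbap VALUES ('1000','000010','000000','TAD'),
                        ('1000','000020','000000','TAD');
-- Order 2000: mixed item categories -> removed by the HAVING clause.
INSERT INTO vbap VALUES ('2000','000010','000000','TAD'),
                        ('2000','000020','000000','TAN');
-- Order 3000: billing date is not today -> removed by the WHERE clause.
INSERT INTO vbap VALUES ('3000','000010','000000','TAD');
INSERT INTO vbkd VALUES ('1000','000000', date('now')),
                        ('2000','000000', date('now')),
                        ('3000','000000','2017-01-01');
""")

# Same shape as the statement in the post: group per order, then keep only
# the groups in which zero items have a non-service category.
rows = conn.execute("""
    SELECT vbap.vbeln
      FROM vbak
      JOIN vbap ON vbap.vbeln = vbak.vbeln
      JOIN vbkd ON vbkd.vbeln = vbak.vbeln
     WHERE vbkd.posnr = '000000'
       AND vbap.uepos = '000000'
       AND vbkd.fkdat = date('now')
     GROUP BY vbap.vbeln
    HAVING SUM(CASE WHEN vbap.pstyv <> 'TAD' THEN 1 ELSE 0 END) = 0
""").fetchall()

print([r[0] for r in rows])  # only order 1000 qualifies
```

The conditional SUM counts "bad" items per group, so `= 0` means "every item in this order passes", which is exactly the all-items-must-match semantics you cannot express in a plain WHERE clause.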


Three White-collar Crime Stories

Today I will tell you three different white-collar crime stories that all have something in common.

Story 1
An IBM Vice President in the Global Engagement Office exposed a disturbing story of IBM's systematic age discrimination: the company used several methods to eliminate thousands of its employees who were over the age of forty (40).

Story 2
Four Audi executives were struggling to make a car that could meet US diesel emissions standards while including an AdBlue tank big enough that the fluid would only have to be replaced every 10,000 miles. (AdBlue is a urea mixture injected into diesel exhaust to reduce the amount of nitrogen oxide that comes out of the tailpipe.) To meet the 10,000-mile requirement, the AdBlue tank would have to encroach on some of the car's interior, and Audi wanted vehicles with a large trunk and a high-end sound system.
So the executives came up with a brilliant idea: regulate how the AdBlue was injected into the exhaust. When the car sensed it was attached to a dynamometer (a tool used for vehicle testing in the US), it "dosed AdBlue at higher levels, ensuring compliance with US NOx emissions standards," the indictment said. However, "during regular driving, the vehicle dosed AdBlue substantially less, which resulted in higher NOx emissions but ensured the AdBlue tanks would not run unacceptably low prior to reaching the 10,000 mile service interval." The purpose of the conspiracy was to defraud the United States and deceive U.S. regulators in order to obtain the certificates necessary to sell the vehicles.

Story 3
An Oracle Senior Finance Manager, North America SaaS/Cloud Revenue, sued Oracle over her unfair dismissal after she refused to do some "improper and suspect" accounting in regard to Oracle's cloud computing business. Oracle has since reached an out-of-court settlement with her. Sapienti sat.

All three stories share the same background: top management fraud by creating schemes to hide or misrepresent what the firm does or how the firm does it. There are individuals within the corporation who deserve blame.

But why did they do it?
According to Understanding the Causes and Effects of Top Management Fraud, white-collar crimes have distinctive characteristics that include: the absence of physical violence, the existence of strong financial motivations, and the involvement of individuals who are otherwise considered respectable members of society.
White-collar crimes committed by top managers present a challenge to social class/poverty theories of crime. These theories fail to effectively explain criminality by high status individuals such as senior executives of the world’s largest corporations. These people are well paid, are in the upper socio-economic classes, and are less prone to experience strain in the traditional sense. To rise to the top of their companies, it is reasonable to assume that the values of these managers are somewhat similar to those of their broader society. Still, these managers may be subject to the strain of inflated expectations—where what they receive from their companies and jobs can never be enough.

In September 2015, Deputy Attorney General Sally Yates issued a memorandum titled Individual Accountability for Corporate Wrongdoing. In it, she stressed that one of the most effective ways to combat corporate misconduct is to hold individuals accountable. In large corporations, where responsibility can be diffuse and decisions are made at various levels, it can be difficult to determine if someone possessed the knowledge and criminal intent necessary to establish their guilt beyond a reasonable doubt. As a result, investigators often must reconstruct what happened based on a painstaking review of corporate documents, which can number in the millions, and which may be difficult to collect due to legal restrictions. Companies cannot pick and choose what facts to disclose. That is, to be eligible for any credit for cooperation, the company must identify all individuals involved in or responsible for the misconduct at issue, regardless of their position, status or seniority, and provide to the Department of Justice all facts relating to that misconduct. Once a company meets the threshold requirement of providing all relevant facts with respect to individuals, it will be eligible for consideration for cooperation credit.

Do you also have a story?


Quantum Computing Timeline by Gartner

from Top 10 Strategic Technology Trends for 2019
Published 15 October 2018 – ID G00374252

Start planning for QC by increasing understanding of how it can apply to real-world business problems. Learn while the technology is still in an emerging state. Identify real-world problems where QC has potential and consider the possible impact on security. But don’t believe the hype that it will revolutionize either of these areas in the next few years. Most organizations should learn about and monitor QC through 2022 and perhaps exploit it from 2023 or 2025. Organizations with significant supercomputer needs, where specific quantum algorithms could provide advantage, may begin light experimentation today using QCaaS.

Track provider advances and look for clear progress in dealing with error rates, coherence times, and QC development environments and algorithms. Leverage QC vendor customer assistance programs to identify opportunities to deliver practical value back to the organization. By 2023, 20% of organizations will be budgeting for quantum computing projects compared to less than 1% today.

QC is nascent and likely to change dramatically through 2028, both in technological and architectural advancements and in algorithmic discoveries. Programming architectures and tools are proprietary and will change as the industry matures. Quantum algorithm development is the weak point of adoption. The scope and applicability of use cases will probably expand QC’s value for those who wait. Wait for QC to mature before buying it and deploying it into production, but don’t ignore QC while waiting for it to mature. Actively monitor industry progress and anticipate potentially disruptive opportunities and challenges. Identify and inventory dependency on quantum-vulnerable cryptographic algorithms, and prepare for their mitigation or replacement by creating an inventory of application dependencies. Evaluate the scope of the effects of QC and postquantum encryption on the organization’s industry by developing use cases, identifying areas of investment and understanding how competitors are preparing.


Quantum Biology
Quantum biology has suffered from a lack of credibility until the last decade or so, when a number of intriguing studies suggested that there might be something to the idea after all. For instance, there is growing evidence that photosynthesis relies on quantum effects to help plants turn sunlight into fuel. Migratory birds might have an internal "quantum compass" that helps them sense Earth's magnetic fields as a means of navigation. Quantum effects might play a role in the human sense of smell, helping us distinguish between different scents.

Mathematical physicist Roger Penrose suggested in 1989 that mysterious protein structures called "microtubules" might exploit quantum effects and hold the secret to human consciousness. Few researchers believe this is actually true, but Matthew Fisher, a physicist at the University of California, Santa Barbara, has recently proposed that the nuclear spins of phosphorus atoms might function as simple "qubits" in the brain. Consciousness, in other words, would work much like a quantum computer.
Current thinking holds that there may be some living systems where quantum processes could play a role before decoherence kicks in. That’s because such systems depend on the dynamics of small numbers of molecules at tiny scales (just a few nanometers), keeping them sufficiently isolated. In fact, the authors contend that recent work in quantum information theory demonstrates that noise might actually support quantum coherence in some systems. Maybe, over billions of years of evolution, nature has learned the trick of maintaining quantum coherence to make use of such effects, and we just don’t yet understand how.
The unprecedented power of the brain suggests that it may process information quantum-mechanically. Since quantum processing is already achieved in superconducting quantum computers, this may imply that superconductivity is the basis of quantum computation in the brain too. Superconductivity could also be responsible for long-term memory. Following these ideas, the paper reviews progress in the search for superconductors with a high critical temperature and tries to answer the question of superconductivity in the brain. It focuses on recent electrical measurements of brain slices, in which graphene was used as a room-temperature quantum mediator, and argues that these measurements could be interpreted as providing evidence of superconductivity in the neural network of mammalian brains.
While humans have been consciously experimenting with superconductivity for about 100 years, nature might have been doing this subconsciously for billions of years, perfecting molecular structures from generation to generation and arriving at the most efficient coherent structures able to process information in a quantum-mechanical way.
The quasi-1D nanostructures in neurons in which one could expect superconductivity are microtubules. It is interesting that they were already suggested as the structures responsible for quantum processing of information.
In nerve cells, microtubules are packed together with shorter neurofilaments. The latter may provide the electron-electron interaction necessary to form a superconducting state.


Social wave function

EU many-body entanglement is over. The social wave function collapses due to decoherence imposed by America and Russia.

The economy, empires, wars: they all have periods, and they obey a fundamental principle of quantum mechanics, but at higher superposition levels. That means they are predictable through probability outputs from solving an adequate quantum algorithm.


Warren Buffett And Bill Gates 2017 Interview
