The Analytical Paradigm has served most human cultures magnificently for more than three centuries, ever since Sir Isaac Newton conceived it. We owe it enormous scientific discoveries and all the derivatives of Enlightenment culture, from the Industrial Age to the didactic architecture of our contemporary school system.
The basic idea is that any physical phenomenon can be completely defined by splitting it into its components and identifying the relationships between them. The presence of errors is attributed to the precision limits of the available instruments, and it is assumed that error can be reduced at will as those instruments are refined. It is also taken for granted that, in theory, it is always possible to reach the ideal conditions under which results are independent of observation methods, as postulated by the Galilean method.
This way of thinking seems natural today, but it was not in the seventeenth century. Try to imagine the exhilarating sense of wonder generated by an idea that suddenly allowed humankind to calculate cosmic effects with absolute precision, in a world where until a few years earlier it had been nearly impossible to guess where a cannonball would land a few hundred meters away. It is not surprising that people began trying to apply the Analytical Paradigm to every other field of human knowledge: from philosophy to the production of artifacts, from state organization to school education. The uninterrupted success streak of the Age of Reason justified the belief that humankind was now in possession of the key to deciphering the whole universe in all its aspects. In letters exchanged by mathematicians and theoretical physicists of the late nineteenth century, the idea of being close to unemployment appears recurrently: they were convinced that little remained to be done to unravel the last mysteries of Nature.
Then Einstein happened.
His correct interpretation of the photoelectric effect in 1905 opened the fruitful thirty years that saw the birth of quantum mechanics. Within a few years it became clear that the Analytical Paradigm was far from universal. Its founding assumptions, such as the theoretical possibility of achieving total independence between observer and observable, held true only in very particular cases. Even the cause-and-effect principle was called into question. All this led to a profound cultural shock: Einstein himself rejected the quantum theory he had helped bring into the world and tried, without success, to prove it false. Today it is commonly accepted at the scientific level that the Analytical Paradigm is not valid in general, although it remains extremely useful in many everyday contexts. However, since milk is not sold in subatomic stores, a century after its demonstration the conceptual falsehood of the Analytical Paradigm has made few inroads into common culture, and its first crisis has gone unnoticed by most.
The same cannot be said for the crisis of its operating framework, brought about by the spread of technologies for the digital representation of reality.