
mathiseasy

Mathematics is an area of knowledge that includes such topics as numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. A person whose occupation involves the study of mathematical truths or objects is known as a mathematician.


In mathematics, the Pythagorean theorem, or Pythagoras' theorem, is a fundamental relation in Euclidean geometry between the three sides of a right triangle. It states that the area of the square whose side is the hypotenuse is equal to the sum of the areas of the squares on the other two sides.

The rule for the Pythagorean theorem: in a right-angled triangle, the square of the hypotenuse is equal to the sum of the squares of the other two sides. The sides of such a triangle are conventionally named the perpendicular, the base and the hypotenuse.
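The rule above translates directly into a few lines of code. A minimal sketch (the function name is my own):

```python
import math

def hypotenuse(base: float, perpendicular: float) -> float:
    """Length of the hypotenuse of a right triangle,
    from c^2 = a^2 + b^2 (the Pythagorean theorem)."""
    return math.sqrt(base * base + perpendicular * perpendicular)

print(hypotenuse(3, 4))  # 5.0, the classic 3-4-5 right triangle
```

The same result is available directly as `math.hypot(3, 4)` in Python's standard library.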

And one of the most important constants in maths is pi. The first rigorous calculation of π was done by Archimedes of Syracuse (287–212 BC), one of the greatest mathematicians of the ancient world. Succinctly, pi, written as the Greek letter π, is the ratio of the circumference of any circle to the diameter of that circle. Regardless of the circle's size, this ratio will always equal pi. In decimal form, the value of pi is approximately 3.14.

It might seem like an obvious piece of any numerical system, but the zero is a surprisingly recent development in human history. In fact, this ubiquitous symbol for “nothing” didn’t even find its way to Europe until as late as the 12th century. Zero’s origins most likely date back to the “fertile crescent” of ancient Mesopotamia. Sumerian scribes used spaces to denote absences in number columns as early as 4,000 years ago, but the first recorded use of a zero-like symbol dates to sometime around the third century B.C. in ancient Babylon. The Babylonians employed a number system based around values of 60, and they developed a specific sign—two small wedges—to differentiate between magnitudes in the same way that modern decimal-based systems use zeros to distinguish between tens, hundreds and thousands. A similar type of symbol cropped up independently in the Americas sometime around 350 A.D., when the Mayans began using a zero marker in their calendars.
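Archimedes approximated π by computing the perimeters of regular polygons fitted to a circle, doubling the number of sides to tighten the estimate. A minimal sketch of the inscribed-polygon side of that idea (the function name and the side-doubling formula used here are my own choices, not a transcription of Archimedes' original procedure):

```python
import math

def archimedes_pi(doublings: int = 10) -> float:
    """Approximate pi with regular polygons inscribed in a unit circle."""
    n = 6    # start with a hexagon; each side equals the radius
    s = 1.0  # side length of the inscribed hexagon
    for _ in range(doublings):
        # side length after doubling the number of sides
        s = math.sqrt(2 - math.sqrt(4 - s * s))
        n *= 2
    # the perimeter n*s approximates the circumference 2*pi
    return n * s / 2

print(archimedes_pi())  # close to 3.14159...
print(math.pi)
```

With ten doublings the polygon has 6,144 sides and the estimate already agrees with π to several decimal places.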
These early counting systems only saw the zero as a placeholder—not a number with its own unique value or properties. A full grasp of zero’s importance would not arrive until the seventh century A.D. in India. There, the mathematician Brahmagupta and others used small dots under numbers to show a zero placeholder, but they also viewed the zero as having a null value, called “sunya.” Brahmagupta was also the first to show that subtracting a number from itself results in zero.

From India, the zero made its way to China and back to the Middle East, where it was taken up by the mathematician Mohammed ibn-Musa al-Khowarizmi around 773. It was al-Khowarizmi who first synthesized Indian arithmetic and showed how the zero could function in algebraic equations, and by the ninth century the zero had entered the Arabic numeral system in a form resembling the oval shape we use today. The zero continued to migrate for another few centuries before finally reaching Europe sometime around the 1100s. Thinkers like the Italian mathematician Fibonacci helped introduce zero to the mainstream, and it later figured prominently in the work of Rene Descartes along with Sir Isaac Newton and Gottfried Leibniz’s invention of calculus. Since then, the concept of “nothing” has continued to play a role in the development of everything from physics and economics to engineering and computing.
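The placeholder role of zero described above can be illustrated with a short sketch. The base-60 choice mirrors the Babylonian system; the helper name is my own:

```python
def from_digits(digits, base=60):
    """Interpret a most-significant-first list of digits in the given base."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

# Without an explicit zero digit, "one, one" is ambiguous:
# is it 60 + 1, or 60^2 + 1 with an empty middle place?
print(from_digits([1, 1]))     # 61   -> one sixty plus one
print(from_digits([1, 0, 1]))  # 3601 -> one 60^2, no sixties, one unit
```

The zero digit is exactly what lets a positional system tell those two magnitudes apart, which is the problem the Babylonians' two-wedge sign solved.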

