Computational Complexity

We are living in a golden age of digital technology, with more mobile devices than people on the planet. Still, few of us stop to consider the complexity of these interdependent systems and how each one contributes to it.

The Role of Programs

Every digital device runs on programs: human-written instructions that tell the computer what to do. At the lowest level, computers execute everything as binary (1s and 0s); hexadecimal (0-9, A-F) is simply a compact, human-readable shorthand for that same binary. Writing code therefore entails translating it into binary that the computer can execute, and that translation adds complexity to the process.
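
To make the relationship concrete, here is a minimal Python sketch (my own illustration, not part of the original article) showing that binary and hexadecimal are just two spellings of the same value:

    # The number 202 written three ways: decimal, binary, and hexadecimal.
    n = 202
    print(bin(n))  # 0b11001010 -> the pattern of 1s and 0s the machine stores
    print(hex(n))  # 0xca       -> a compact shorthand for those same bits
    print(int("11001010", 2), int("ca", 16))  # both print 202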

Challenges in Code Writing

So, how should we write code? Naturally, we want the quickest route. Our goal is twofold: to make the code easy to transform into binary (which I’ll refer to as Challenge A) and to reduce the amount of time needed to run the program (Challenge B).

Challenge A: Code Conciseness

The goal is to write concise code so that converting it into binary stays as simple as possible. Vast amounts of code are far harder to decipher than small amounts. By way of comparison, imagine deciphering a message: would you rather it took 100 requests to complete, or just one?
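
As a rough illustration of my own (the article itself doesn't include code), here are two Python versions of the same task; the concise one gives the computer far less code to translate:

    # Verbose: one statement per number (imagine 100 lines like these).
    print(1)
    print(2)
    print(3)
    # ...and 97 more lines.

    # Concise: a single loop expresses the same intent in two lines.
    for i in range(1, 101):
        print(i)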

Challenge B: Execution Time

A program must also execute efficiently. The aim is to reduce the amount of time the software takes to run.
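
To make this concrete, here is a small Python sketch of my own: both functions compute the sum 1 + 2 + ... + n, but the first performs n additions while the second performs a fixed handful of operations no matter how large n is:

    # Two ways to sum the numbers 1..n.
    def sum_loop(n):
        total = 0
        for i in range(1, n + 1):  # performs n additions
            total += i
        return total

    def sum_formula(n):
        return n * (n + 1) // 2    # a few operations, regardless of n

    print(sum_loop(1000), sum_formula(1000))  # 500500 500500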

Complexity: Measuring Program Speed

How would we go about judging the speed of various programs? Measuring the time from start to finish can seem easy at first, but it’s not that simple. To determine whether a given measure is useful, we will check three things:

  • It should stay the same for different implementations that are intuitively the same.
  • It should stay the same when the same program is run again.
  • It should stay the same on different computers.

So, without further ado, let’s get started!
Fig 1. Measuring Program Speed

The first approach we’ll examine is the straightforward one: timing the entire process from start to finish. But as hinted above, there are more workable answers. Let’s run the three tests. The first fails, since the CPU’s level of utilisation affects the measured time, so two intuitively identical implementations can give different results. The second fails for the same reason: running the same program again can give a different time. Finally, because different computers have varying amounts of processing power, the third fails as well. A terrible beginning! Let’s try to measure better. The next method counts lines of code: when it comes across a repeat statement, it multiplies the lines inside by the number of times they are repeated. Let’s check this method against our tests. (A small sketch contrasting timing with counting follows the list below.)

  • The first test suggests this measure might not work: one implementation may create a counter or do something similar, which changes the line count even though the programs are intuitively the same.
  • The second test passes, since counting lines doesn’t involve anything that varies with the system. The same reasoning applies to the third test, so this measure is fairly effective.
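
Here is a rough Python sketch of my own contrasting the two measures. The wall-clock time drifts from run to run with CPU load and hardware, while the operation count (standing in here for the line count) is identical on every run and every machine:

    import time

    def work(n):
        # Sum 1..n while counting the operations performed.
        total, ops = 0, 0
        for i in range(1, n + 1):
            total += i
            ops += 1
        return total, ops

    for run in range(3):
        start = time.perf_counter()
        total, ops = work(100_000)
        elapsed = time.perf_counter() - start
        # 'elapsed' varies between runs; 'ops' is always 100000.
        print(f"run {run}: {elapsed:.6f}s, {ops} operations")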

Now, we have only one challenge left: equivalent implementations can still differ by a few lines or a constant factor. “Big O Notation” solves this. If the input size is n, it drops the constants and removes all terms except the highest. So, 2n+3 becomes “n” and 3n^2 + 6n + 5 (n^2 means n*n) becomes “n^2”.
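
To see why dropping the constants and lower terms is justified, here is a small sketch of my own: as n grows, 3n^2 + 6n + 5 is dominated entirely by its n^2 term.

    # The ratio f(n) / n^2 settles towards the constant 3,
    # so only the n^2 term matters for large inputs.
    def f(n):
        return 3 * n**2 + 6 * n + 5

    for n in (10, 100, 1000, 10000):
        print(n, f(n) / n**2)  # 3.65, 3.0605, 3.006005, 3.00060005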

Does this work?

Test 1: It works! Dropping constants and lower-order terms removes the risk of different counts for different implementations that are intuitively the same.

Test 2: This will also work, as it has nothing to do with counting something that changes based on the system. The same reasoning holds for test 3. So, Big O notation is the correct measurement!

Understanding Types of Speed

Now that we’ve found the correct measurement, let’s think about the different types of speed. I’ve listed the most common ones below and explained their complexity.

The most common complexities, from worst to best (a short code sketch illustrating several of them follows the list):

  • O(n!): n! means n*(n-1)*(n-2)*...*2*1, so this can be extremely large.
  • O(2^n): this means 2 multiplied by itself n times.
  • O(n^2): this means n*n.
  • O(n log n): log(n) is a very slowly growing function, so O(n log n) is worse than O(n) but better than O(n^2).
  • O(n): this is quite self-explanatory; it grows at the same rate n grows.
  • O(log n): as mentioned above, log(n) grows very slowly, so this is very good.
  • O(1): this always takes constant time and doesn’t change even when n is extremely large.
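
As an illustration of my own (using standard textbook examples), here are tiny Python functions whose running times fall into several of these classes:

    # Illustrative functions with different Big O running times.
    def constant(items):               # O(1): one lookup, any size
        return items[0]                # assumes a non-empty list

    def linear(items, target):         # O(n): may inspect every item
        for x in items:
            if x == target:
                return True
        return False

    def quadratic(items):              # O(n^2): visits every pair
        pairs = 0
        for a in items:
            for b in items:
                pairs += 1
        return pairs

    def logarithmic(sorted_items, target):  # O(log n): binary search
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return True
            if sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return False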

Conclusion

Navigating the complexities of code efficiency involves addressing challenges, adopting appropriate measurement methods, and understanding scalability through Big O Notation. As we contemplate the spectrum of speed complexities, from the challenging factorial growth to the efficient constant time, we gain insights essential for navigating the intricacies of modern programming.

To stay updated with the latest developments in STEM research, visit ENTECT Online. This is our digital magazine for science, technology, engineering and mathematics.

