Concurrency Problems: Lost Updates and Dirty Reads

Modern databases are designed to support multiple users at the same time. While this improves efficiency, it also introduces the risk of concurrency problems. In IB Computer Science, students are expected to understand what concurrency is, why problems occur, and how databases prevent data corruption.

Two of the most important concurrency problems examined are lost updates and dirty reads.

What Is Concurrency?

Concurrency occurs when:

  • Multiple transactions access the same data
  • Their executions overlap in time

Concurrency allows:

  • Faster system response
  • Efficient use of resources
  • Multi-user access

However, without proper control, concurrency can cause data inconsistency and errors.

Why Concurrency Causes Problems

When transactions run simultaneously:

  • They may read or modify the same data
  • Changes may interfere with each other
  • The final data may be incorrect

Concurrency problems arise when:

  • Transactions are not properly isolated
  • Partial changes become visible
  • Updates overwrite each other

This is why isolation is a key ACID property.

What Is a Lost Update?

A lost update occurs when:

  • Two transactions read the same data
  • Both modify it independently
  • One update overwrites the other

As a result:

  • One transaction’s changes are lost

Example of a Lost Update

  • Transaction A reads a balance
  • Transaction B reads the same balance
  • Both calculate a new value
  • Transaction A writes its update
  • Transaction B writes its update later

The final value reflects only one change, even though two should have been applied.
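
This sequence can be simulated in a few lines of Python. The sketch below is purely illustrative: the shared balance variable and the deposit() function stand in for a database row and a read-modify-write transaction, and the short sleep simply forces the two "transactions" to overlap.

  import threading
  import time

  balance = 100  # shared value, standing in for a database row

  def deposit(amount):
      global balance
      read_value = balance           # transaction reads the balance
      time.sleep(0.1)                # the other transaction runs in this gap
      balance = read_value + amount  # write based on the now-stale read

  a = threading.Thread(target=deposit, args=(50,))  # Transaction A
  b = threading.Thread(target=deposit, args=(30,))  # Transaction B
  a.start(); b.start()
  a.join(); b.join()

  print(balance)  # expected 180, but prints 130 or 150: one update was lost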

In IB exams, lost updates are linked to poor isolation.

Why Lost Updates Are Dangerous

Lost updates cause:

  • Incorrect data
  • Inconsistent records
  • Loss of important changes

In systems such as:

  • Banking
  • Inventory management
  • Student records

lost updates can have serious consequences.

What Is a Dirty Read?

A dirty read occurs when:

  • One transaction reads data
  • That data has been modified by another transaction
  • The modifying transaction has not yet committed

If the modifying transaction is later rolled back:

  • The read data was never valid

This means:

  • A transaction used temporary, unreliable data
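
A small Python sketch can make the timing concrete. Nothing here uses a real database engine: the committed dictionary plays the role of durable, committed data, and the hand-managed working copy represents an uncommitted change that a poorly isolated reader can see.

  committed = {"balance": 100}   # committed state of the "database"
  working = dict(committed)      # uncommitted working copy of Transaction A

  # Transaction A modifies the balance but has not committed yet
  working["balance"] = 500

  # Transaction B reads the uncommitted value: a dirty read
  seen_by_b = working["balance"]
  print("Transaction B saw:", seen_by_b)             # 500

  # Transaction A now rolls back, discarding its change
  working = dict(committed)

  print("Committed balance:", committed["balance"])  # still 100

Transaction B has acted on a value (500) that never became part of the committed database, which is exactly the danger described above.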

Why Dirty Reads Are a Problem

Dirty reads cause:

  • Decisions based on incorrect data
  • Inconsistent query results
  • Cascading errors if the uncommitted change is rolled back

IB students should understand that dirty reads violate data consistency and integrity.

How Databases Prevent These Problems

Databases prevent lost updates and dirty reads using:

  • Transaction isolation
  • Locking mechanisms
  • Controlled access to data

Isolation ensures that:

  • Transactions do not interfere
  • Partial changes are hidden
  • Updates occur safely

This directly links to the Isolation property of ACID.
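
As a rough illustration of locking, the lost-update sketch from earlier can be repaired by serialising the two read-modify-write sequences. In a real database the lock is taken by the engine itself (for example, a row lock acquired by an UPDATE, or explicitly with SELECT ... FOR UPDATE); in this Python sketch a threading.Lock plays that role.

  import threading
  import time

  balance = 100
  lock = threading.Lock()  # stands in for the database's row lock

  def deposit(amount):
      global balance
      with lock:                         # only one "transaction" may proceed
          read_value = balance
          time.sleep(0.1)
          balance = read_value + amount  # written before the lock is released

  a = threading.Thread(target=deposit, args=(50,))
  b = threading.Thread(target=deposit, args=(30,))
  a.start(); b.start()
  a.join(); b.join()

  print(balance)  # always 180: neither update is lost

Because the second transaction cannot read the balance until the first has finished writing it, the two updates no longer interfere, which is the effect isolation provides at the database level.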

Concurrency Control and ACID

Concurrency problems highlight why ACID exists:

  • Atomicity prevents partial updates
  • Consistency enforces rules
  • Isolation prevents interference
  • Durability preserves committed data

IB examiners often expect students to connect concurrency problems to ACID.

Common Student Mistakes

Students often:

  • Confuse lost updates and dirty reads
  • Ignore timing of transactions
  • Forget the role of isolation
  • Describe problems too vaguely

Clear step-by-step explanations score higher.

How This Appears in IB Exams

IB questions may ask students to:

  • Explain a concurrency problem
  • Identify lost updates or dirty reads
  • Justify isolation mechanisms
  • Apply concepts to real-world systems

Logical sequencing earns marks.

Final Thoughts

Concurrency allows databases to support many users, but without control it can cause serious data errors. Lost updates overwrite changes, while dirty reads expose uncommitted data.

Understanding these problems helps IB Computer Science students explain why transaction isolation is essential — exactly what examiners expect.
