Memory Consistency Models¹

David Mosberger
Department of Computer Science
The University of Arizona
Tucson, AZ 85721

TR 93/11

Abstract

This paper discusses memory consistency models and their influence on software in the context of parallel machines. In the first part we review previous work on memory consistency models. The second part discusses the issues that arise due to weakening memory consistency. We are especially interested in the influence that weakened consistency models have on language, compiler, and runtime system design. We conclude that tighter interaction between those parts and the memory system might improve performance considerably.

¹ This is an updated version of [Mos93].

1 Introduction

Traditionally, memory consistency models were of interest only to computer architects designing parallel machines. The goal was to present a model as close as possible to the one exhibited by sequential machines. The model of choice was sequential consistency (SC). Sequential consistency guarantees that the result of any execution of n processors is the same as if the operations of all processors were executed in some sequential order, and the operations of each individual processor appear in this sequence in the order specified by its program.
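The paper states the SC guarantee abstractly; the classic store-buffering litmus test makes it concrete. The sketch below is our illustration, not from the report, and uses C++11 atomics (which postdate it): two threads each write one shared flag and then read the other. Under SC, the outcome r1 == 0 && r2 == 0 is impossible, because some sequential interleaving of the four operations must place at least one store before the opposing load.

    // Store-buffering litmus test (illustrative sketch; not from the report).
    // Under sequential consistency, some interleaving of the four memory
    // operations must order at least one store before the opposing load,
    // so r1 and r2 cannot both end up 0.
    #include <atomic>
    #include <cstdio>
    #include <thread>

    std::atomic<int> x{0}, y{0};
    int r1 = 0, r2 = 0;

    void p1() {
        x.store(1, std::memory_order_seq_cst);  // write own flag
        r1 = y.load(std::memory_order_seq_cst); // then read the other
    }

    void p2() {
        y.store(1, std::memory_order_seq_cst);
        r2 = x.load(std::memory_order_seq_cst);
    }

    int main() {
        std::thread t1(p1), t2(p2);
        t1.join();
        t2.join();
        std::printf("r1 = %d, r2 = %d\n", r1, r2); // SC: never "r1 = 0, r2 = 0"
        return 0;
    }

Replacing memory_order_seq_cst with memory_order_relaxed allows the hardware to buffer each store past the following load, so r1 == r2 == 0 becomes observable. That latitude is precisely what the weaker models discussed in the remainder of the paper exploit for performance.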

Shared memory can be implemented at the hardware or software level; in the latter case it is usually called Distributed Shared Memory (DSM). At both levels, work has been done to reap the benefits of weaker models. We conjecture that in the near future most parallel machines will be based on consistency models significantly weaker than SC [LLG+92, Sit92, BZ91, CBZ91, KCZ92].

The rest of this paper is organized as follows. In section 2 we discuss issues characteristic of memory consistency models. In the following section we present several consistency models and their implications for the programming model. We then take a look at implementation options in section 4. Finally, section 5 discusses the influence of weakened memory consistency models on software. In particular, we discuss the interactions between a weakened memory system and the software using it.


References

GIT-CC-92/34, Georgia Institute of Technology, Atlanta, GA 30332-0280, USA, 1992.
Leslie Lamport. Time, clocks, and the ordering of events in a distributed system. Communications of the ACM, 21(7):558–565, 1978.
Parallel Processing, volume II, pages 252–257, 1990.
Addison-Wesley, Reading, Massachusetts, 1987.
