Models of Memory


Atkinson and Shiffrin suggested that memory is comprised of three separate stores: the Sensory Memory Store, the short-term memory (STM) and the long-term memory (LTM). Each store has a specific and relatively inflexible function.

Information is simply rehearsed in the STM and, if rehearsed sufficiently, is transferred to the LTM. Information to be recalled from the LTM passes back through the STM, producing the associated response. The model is represented in the diagram below.

The Multi-store Model.


Evidence for the Multi-store Model:

  1. Primacy-Recency Effect - Atkinson (1970). When presented with lists to remember, we recall the first and last items best: the first items have been rehearsed into LTM and the last items are recalled from STM, while items in the middle are less likely to be recalled. This is evidence for the existence of several stores.
  2. The Brown-Peterson technique suggests that if rehearsal of items is prevented, information does not enter LTM.
  3. Amnesia caused by Korsakoff's Syndrome, brought on by chronic alcoholism, leaves STM functioning soundly but impairs LTM. This suggests separate and distinct memory stores.
  4. Shallice and Warrington (1970). Case study of K.F., who suffered brain damage in a motorbike accident: his STM was impaired but his LTM was intact.

Evidence against the Multi-store Model:

  1. De Groot (1966) showed that expert chess players had phenomenal STM for chess positions as long as the positions fitted in with the known rules. When pieces were randomly arranged, their recall was no better than that of non-chess-players, so STM and LTM may not be so separate and distinct.
  2. The Multi-store Model is basic and limited in explaining such a complex phenomenon as memory.

    The Working Memory Model. An alternative to the Multi-store Model that emphasises the workings of STM. It is a far more complex explanation of STM.

    Rather than the STM being a single inflexible store, Baddeley and Hitch suggested that the STM was made up of several subsystems, each having a specialised function.

    They suggested that these subsystems were involved in complex cognitions/thought processes, including analysis and judgements about information input.

    Baddeley and Hitch (1974) provided evidence for this by showing that people are able to carry out more than one task at once where both tasks involve STM functions.


    According to Baddeley & Hitch, participants in such dual-task experiments should complete both tasks successfully.

    However, according to Miller's 7 +/- 2 theory and to Atkinson and Shiffrin, the STM should have reached full capacity by attending to the letters alone.

    Therefore the STM is more complex and may have several subsystems that can operate simultaneously.

    Baddeley & Hitch suggested the existence of several subsystems in STM, but they studied two in particular, governed by a central controlling mechanism which they termed the Central Executive.

    This 'fat controller' is the boss: it supervises and coordinates the subsidiary systems. The Central Executive decides which information is attended to and which parts of working memory that information is sent to for processing.

    The two subsystems studied were named the Visuo-spatial Sketchpad and the Phonological/Articulatory Loop.

    The Visuo-spatial Sketchpad deals with what information looks like and how it is laid out - it deals with visual and spatial information.

    The Phonological Loop holds spoken information for about 1.5 to 2 seconds; written words must be converted to spoken form to enter it. The Articulatory Loop rehearses the spoken/acoustic information from the phonological store and also converts written material into acoustic material so that the phonological loop can deal with it.

    There is little empirical evidence to support the Working Memory Model, but its recognition of the complexity of the STM makes sound theoretical sense. Moreover, some brain-damaged patients appear to suffer impairment to some functions of STM and not others (Shallice & Warrington, 1974), suggesting the existence of several specialised systems within STM.

    The Levels of Processing Model.

    This model of memory concentrates on the LTM and the semantic processing occurring there.

    It presents another alternative to the Multi-store Model, which suggests that information is transferred to LTM through rehearsal (repetition).

    This model suggests that the depth or level at which we process information determines its place in LTM and also how well we recall that information.

    So: the more we think about information, for whatever reason, the more likely it is to be remembered, and for longer.

    Craik & Lockhart accepted Atkinson & Shiffrin's separate stores but suggested that the encoding and processing of information in LTM was more complex. They suggested that information could be processed or encoded at Shallow, Deeper and Deepest levels.

    The deeper the processing the stronger and more durable the memory.

    Sample questions.

    To illustrate this have a go at the following questions:

    1. Is the word FISH in lower case or capital letters?
    2. Does the word STYLE rhyme with 'pin'?
    3. Is the word PANCAKE a form of transport?


    Work out which of these questions involves shallow processing, involving only the appearance of the word.

    Which question required deeper processing, involving the appearance and sound of the word?

    Which question required deepest processing, involving semantic (meaning) analysis?

    According to Craik & Lockhart, the word processed at the deepest level should be best remembered.

    The first question is shallow.

    The second question is deeper.

    The third question is deepest.

    Craik & Lockhart suggested that semantic processing can operate at different depths of analysis, some being more complex than others; they referred to this as Elaborate Semantic Processing.

    Craik and Lockhart used laboratory experiments, which can be criticised in terms of validity and representativeness.

    The variables identified may be difficult to operationalise, as 'depth of processing' may be highly individual - what is deep for one person may be shallow for another. This makes generalisation difficult.

    Bransford (1979) suggested that processing in LTM is even more complex than that proposed by Craik & Lockhart.