COMP[39]161 Concepts of Programming Languages
Semester 2, 2018

Tute (Week 13)


These questions are intended as revision, although Michael's tutes may wish to cover the final STM question first, as they will not have covered STM yet.

There are a number of questions here, and it may not be possible to cover all of them in the tute. Feel free to direct your tutor towards the questions you most need help with.

1 Abstract Machines

Here is a simple expression language, consisting of atomic terms \(\mathbf{P}, \mathbf{A}, \mathbf{PP}, \mathbf{AP}, \mathbf{PPAP}\) and a single operator \(+\).

Its big-step semantics are as follows:

\[ \dfrac{}{\mathbf{P} \Downarrow \mathbf{P}}{\mathrm{Have}_1} \quad\quad \dfrac{}{\mathbf{A} \Downarrow \mathbf{A}}{\mathrm{Have}_2} \] \[ \dfrac{a \Downarrow \mathbf{P} \quad b \Downarrow \mathbf{A} }{a + b \Downarrow \mathbf{AP} }{\mathrm{Uh}_1} \] \[ \dfrac{a \Downarrow \mathbf{P} \quad b \Downarrow \mathbf{P} }{a + b \Downarrow \mathbf{PP} }{\mathrm{Uh}_2} \] \[ \dfrac{a \Downarrow \mathbf{AP} \quad b \Downarrow \mathbf{PP}}{a + b \Downarrow \mathbf{PPAP} }{\mathrm{Uh}_3} \]
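As a sanity check, the five rules transcribe directly into a recursive evaluator. A sketch in Haskell (the `Term` type and `eval` are my own names, not part of the question; `Nothing` marks terms with no derivation):

```haskell
-- Terms of the expression language: the five atoms plus the (+) operator.
data Term = P | A | PP | AP | PPAP | Plus Term Term
  deriving (Eq, Show)

-- Big-step evaluation, one equation per rule.
eval :: Term -> Maybe Term
eval P = Just P                          -- Have1
eval A = Just A                          -- Have2
eval (Plus a b) =
  case (eval a, eval b) of
    (Just P,  Just A)  -> Just AP        -- Uh1
    (Just P,  Just P)  -> Just PP        -- Uh2
    (Just AP, Just PP) -> Just PPAP      -- Uh3
    _                  -> Nothing        -- no rule applies: stuck
eval _ = Nothing                         -- PP, AP, PPAP have no evaluation rules
```

For instance, `eval (Plus (Plus P A) (Plus P P))` yields `Just PPAP`, while `eval (Plus A P)` is stuck.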

a) Translate this language to an equivalent small-step semantics where all rules are axioms. Use a stack, similarly to the C-Machine, in order to keep track of the current expression under evaluation.

Assuming a stack much like the C-Machine and similar evaluation states, we have the following rules:

  1. \(s \succ \mathbf{P} \mapsto s \prec \mathbf{P}\)
  2. \(s \succ \mathbf{A} \mapsto s \prec \mathbf{A}\)
  3. \(s \succ (a + b) \mapsto (\square + b) \triangleright s \succ a\)
  4. \((\square + b) \triangleright s \prec \mathbf{P} \mapsto (\mathbf{P} + \square) \triangleright s \succ b\)
  5. \((\square + b) \triangleright s \prec \mathbf{AP} \mapsto (\mathbf{AP} + \square) \triangleright s \succ b\)
  6. \((\mathbf{P} + \square) \triangleright s \prec \mathbf{A} \mapsto s \prec \mathbf{AP}\)
  7. \((\mathbf{P} + \square) \triangleright s \prec \mathbf{P} \mapsto s \prec \mathbf{PP}\)
  8. \((\mathbf{AP} + \square) \triangleright s \prec \mathbf{PP} \mapsto s \prec \mathbf{PPAP}\)

b) This language allows terms that do not evaluate to a successful result. Add a new state to denote errors, and extend your rules so that all possible expressions evaluate to a result (which may be your error state).

Let \(\mathbf{E}\) be the error state.

  1. \(\dfrac{v \notin \{ \mathbf{P}, \mathbf{AP} \}}{(\square + b) \triangleright s \prec v \mapsto \mathbf{E}}\)
  2. \(\dfrac{v \notin \{ \mathbf{P}, \mathbf{A} \}}{(\mathbf{P} + \square) \triangleright s \prec v \mapsto \mathbf{E}}\)
  3. \(\dfrac{v \neq \mathbf{PP} }{(\mathbf{AP} + \square) \triangleright s \prec v\mapsto \mathbf{E}}\)
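Both parts can be checked by running the machine. A sketch in Haskell (the type and function names are mine; I additionally send the atoms \(\mathbf{PP}\), \(\mathbf{AP}\), \(\mathbf{PPAP}\) appearing in evaluation position to the error state, since no rule evaluates them):

```haskell
data Term = P | A | PP | AP | PPAP | Plus Term Term
  deriving (Eq, Show)

-- Stack frames, as in the C-Machine: a hole on the left or right of (+).
data Frame = PlusL Term   -- (☐ + b): evaluating the left operand, b pending
           | PlusR Term   -- (v + ☐): left operand's value v, evaluating right
type Stack = [Frame]

-- Machine states: evaluating a term, returning a value, or the error state E.
data State = Eval Stack Term | Ret Stack Term | Err

step :: State -> State
step (Eval s P)          = Ret s P                   -- rule 1
step (Eval s A)          = Ret s A                   -- rule 2
step (Eval s (Plus a b)) = Eval (PlusL b : s) a      -- rule 3
step (Eval _ _)          = Err                       -- PP/AP/PPAP cannot be evaluated
step (Ret (PlusL b : s) v)
  | v `elem` [P, AP]     = Eval (PlusR v : s) b      -- rules 4, 5
  | otherwise            = Err                       -- error rule 1
step (Ret (PlusR P  : s) A)  = Ret s AP              -- rule 6
step (Ret (PlusR P  : s) P)  = Ret s PP              -- rule 7
step (Ret (PlusR AP : s) PP) = Ret s PPAP            -- rule 8
step (Ret (PlusR _  : _) _)  = Err                   -- error rules 2, 3
step st                  = st                        -- final states do not step

-- Drive the machine from an initial state to a final one.
run :: Term -> Maybe Term
run t = go (Eval [] t)
  where
    go (Ret [] v) = Just v
    go Err        = Nothing
    go st         = go (step st)
```

With this, `run (Plus (Plus P A) (Plus P P))` reaches `Just PPAP`, while `run (Plus A P)` lands in the error state.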

2 Semantic Properties

a) Mutual Exclusion is said to be a safety property, and eventual entry (or starvation freedom) is said to be a liveness property. Suppose I was impatient, and stated a stronger eventual entry property as follows:

Any process which wishes to enter its critical section will do so before the end of the day.

Is this version still a liveness property?

No, it is a safety property, as a violation can be observed after a finite number of steps. Simply put, if I wait until the end of the day, that is a finite amount of time, after which I can determine whether or not the property has been satisfied.

b) What are the properties of progress and preservation? Are they safety or liveness properties?

Progress states that well-typed terms are not stuck. Preservation states that well-typed terms evaluate to terms that have the same type.

Both of these are safety properties: a violation is witnessed by a finite execution, either a well-typed term that is stuck, or an evaluation step that changes the type. Together they make up type safety.

3 Type Theory

a) Which of the following types are isomorphic to each other (assuming a strict semantics)?

  1. \(\textbf{rec}\ t.\ t \times t\)
  2. \(\textbf{0}\)
  3. \(\textbf{1}\)
  4. \(\textbf{1} \times \textbf{0}\)
  5. \(\textbf{rec}\ t.\ t \times \textbf{1}\)
  6. \(\textbf{rec}\ t.\ t + \textbf{1}\)

Types 1, 2, 4 and 5 are all isomorphic: under a strict semantics, 1 and 5 have no values (their values would have to be infinite), and \(\textbf{1} \times \textbf{0}\) has none either, so all are isomorphic to \(\textbf{0}\). Types 3 and 6 are each in their own equivalence class: \(\textbf{1}\) has exactly one value, while \(\textbf{rec}\ t.\ t + \textbf{1}\) is the type of natural numbers.
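To see why type 6 stands apart, note that under a strict semantics its values are in bijection with the natural numbers. A sketch in Haskell (the newtype `N` and both function names are mine):

```haskell
-- rec t. t + 1 as a Haskell newtype:
-- Roll (Right ()) is zero, Roll (Left n) is the successor of n.
newtype N = Roll (Either N ())

toInt :: N -> Integer
toInt (Roll (Right ())) = 0
toInt (Roll (Left n))   = 1 + toInt n

fromInt :: Integer -> N
fromInt 0 = Roll (Right ())
fromInt k = Roll (Left (fromInt (k - 1)))
```

`toInt` and `fromInt` are mutually inverse, witnessing the bijection with ℕ; no such bijection exists with \(\textbf{0}\) (no values) or \(\textbf{1}\) (one value).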

b) Write a program which is a proof of the symmetry of disjunction, i.e. \[A \lor B \Rightarrow B \lor A\].

proof = (recfun f x = case x of InL a -> InR a; InR b -> InL b)
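In Haskell, where sums are `Either` and the injections InL/InR are `Left`/`Right`, the same proof term reads:

```haskell
-- A ∨ B ⇒ B ∨ A: swap the two injections.
proof :: Either a b -> Either b a
proof (Left a)  = Right a
proof (Right b) = Left b
```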

c) Lambda calculus is said to correspond to constructive logic. What is meant by constructive here?

Constructive logics require proofs to present evidence. Therefore, they do not accept principles like the law of the excluded middle and double negation elimination.

Trying to write a typed λ-calculus program with a type that corresponds to \(\neg (\neg A) \rightarrow A\) is indeed impossible: the term would have to produce evidence for \(A\) out of nothing.
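The other direction, however, is constructively fine: from \(A\) we can always prove \(\neg(\neg A)\). Encoding \(\neg A\) as a function into a result type \(r\) (read \(r\) as the empty type \(\textbf{0}\)), the proof term is just continuation application. A sketch (the name `dni` is mine):

```haskell
-- Double negation introduction: instantiating r with the empty type
-- makes this exactly A ⇒ ¬(¬A). The proof term applies the continuation.
dni :: a -> ((a -> r) -> r)
dni a k = k a
```

The converse type, \(((a \rightarrow \textbf{0}) \rightarrow \textbf{0}) \rightarrow a\), has no total implementation, which is precisely the constructive rejection of double negation elimination.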

4 Polymorphism and Type Inference

a) What is the most general unifier of \((a \rightarrow b) \rightarrow c\) and \(x \rightarrow (y \rightarrow z)\)?

\[[x := a \rightarrow b, c := y \rightarrow z]\] Any further substitution reduces generality.

b) Why is there no inferrable type for the following term?

\[ \textbf{recfun}\ f\ x = x\ x \]

There is no inferrable type because the application \(x\ x\) requires \(x\) to take itself as an argument. We end up with a unification constraint like \(a = a \rightarrow b\), which fails the occurs check.
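Both answers in this section can be checked against a toy first-order unifier (a sketch; the `Ty` type and all names are mine, not part of any course library):

```haskell
-- Type terms: variables and function arrows.
data Ty = Var String | Arrow Ty Ty
  deriving (Eq, Show)

type Subst = [(String, Ty)]

-- Apply a substitution, chasing bound variables.
apply :: Subst -> Ty -> Ty
apply s (Var v)     = maybe (Var v) (apply s) (lookup v s)
apply s (Arrow t u) = Arrow (apply s t) (apply s u)

-- Does variable v occur in t?
occurs :: String -> Ty -> Bool
occurs v (Var w)     = v == w
occurs v (Arrow t u) = occurs v t || occurs v u

unify :: Ty -> Ty -> Maybe Subst
unify (Var v) t
  | t == Var v = Just []
  | occurs v t = Nothing            -- a = a -> b fails right here
  | otherwise  = Just [(v, t)]
unify t (Var v) = unify (Var v) t
unify (Arrow a b) (Arrow c d) = do
  s1 <- unify a c
  s2 <- unify (apply s1 b) (apply s1 d)
  Just (s2 ++ s1)
```

Unifying \((a \rightarrow b) \rightarrow c\) with \(x \rightarrow (y \rightarrow z)\) produces exactly the substitution above, while unifying \(a\) with \(a \rightarrow b\) fails the occurs check.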

5 Overloading

Here is a type class that describes how to combine values:

class Semigroup a where
  (<>) :: a -> a -> a

Provide a suitable instance for lists (containing any element type), and show the type class dictionary that results from the compilation process.

instance Semigroup [a] where
  (<>) = (++)

Since the class has only one method, the dictionary is simply a value of type \(a \rightarrow a \rightarrow a\), so the compiler would generate something like:

type SemigroupDict a = (a -> a -> a)

listSemigroupDict :: SemigroupDict [a]
listSemigroupDict = (++)
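An overloaded function is then compiled to take the dictionary as an ordinary extra argument. For instance, a source-level `combine3 x y z = x <> y <> z` would translate to something like (a sketch; `combine3` is my name, and the dictionary definitions are repeated to keep the block self-contained):

```haskell
-- The generated dictionary type and the instance's dictionary, as above.
type SemigroupDict a = a -> a -> a

listSemigroupDict :: SemigroupDict [a]
listSemigroupDict = (++)

-- Dictionary-passing translation of:  combine3 x y z = x <> y <> z
-- The Semigroup constraint has become a value parameter.
combine3 :: SemigroupDict a -> a -> a -> a -> a
combine3 dict x y z = dict x (dict y z)
```

At each call site the compiler supplies the dictionary chosen by instance resolution, e.g. `combine3 listSemigroupDict [1,2] [3] [4]`.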

6 Subtyping

Consider the following type:

type Cont a = (a -> IO ()) -> IO () 

Is Cont covariant, contravariant or invariant? Justify your answer with coercion functions if needed.

It is covariant. Consider two types A and B and a coercion function coerce :: A -> B. Then we can write:

coerceCont :: Cont A -> Cont B
coerceCont axx bx = axx (bx . coerce) 
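To check the coercion concretely, we can instantiate A and B (here `Int` and `Integer`, with `fromIntegral` standing in for the subtyping coercion) and observe the continuation's effect through an `IORef`; all of this scaffolding is mine:

```haskell
import Data.IORef (newIORef, readIORef, writeIORef)

type Cont a = (a -> IO ()) -> IO ()

-- Stand-in for the coercion witnessing A <: B.
coerce :: Int -> Integer
coerce = fromIntegral

coerceCont :: Cont Int -> Cont Integer
coerceCont axx bx = axx (bx . coerce)

-- A Cont Int that feeds 42 to whatever continuation it is given.
example :: Cont Int
example k = k 42

demo :: IO Integer
demo = do
  r <- newIORef 0
  coerceCont example (writeIORef r)  -- Integer continuation run on a coerced Int
  readIORef r
```

`demo` writes the coerced value 42 into the reference, confirming that `coerceCont` turns a producer of `Int`s into a producer of `Integer`s.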

7 Concurrency

Consider the following two threads, each manipulating shared variables \(a\) and \(b\), both starting at zero:

Thread A: \(a := a + b + 1; a := a + b + 1\)

Thread B: \(b := a + b + 1; b := a + b + 1\)

Assume atomicity for each assignment statement. What are the possible final values of \(a\) and \(b\)?

All six interleavings are listed below, writing a1, a2 for Thread A's two assignments and b1, b2 for Thread B's; the last three are mirror images of the first three:

  1. a1a2b1b2 - 2, 6
  2. a1b1a2b2 - 4, 7
  3. b1a1a2b2 - 4, 6
  4. a1b1b2a2 - 6, 4
  5. b1a1b2a2 - 7, 4
  6. b1b2a1a2 - 6, 2
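The six outcomes can be confirmed mechanically by enumerating every interleaving of the four atomic assignments. A sketch in Haskell (all names are mine):

```haskell
import Data.List (nub, sort)

-- Each atomic assignment maps the shared state (a, b) to a new state.
type Stmt = (Int, Int) -> (Int, Int)

incA, incB :: Stmt
incA (a, b) = (a + b + 1, b)  -- a := a + b + 1
incB (a, b) = (a, a + b + 1)  -- b := a + b + 1

-- All interleavings of two instruction sequences, preserving program order.
interleavings :: [x] -> [x] -> [[x]]
interleavings [] ys = [ys]
interleavings xs [] = [xs]
interleavings (x:xs) (y:ys) =
  map (x:) (interleavings xs (y:ys)) ++ map (y:) (interleavings (x:xs) ys)

-- Final states over all interleavings, starting from a = b = 0.
finals :: [(Int, Int)]
finals = sort . nub $
  [ foldl (flip ($)) (0, 0) order
  | order <- interleavings [incA, incA] [incB, incB] ]
```

`finals` evaluates to exactly the six pairs tabulated above.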

Are any other final values possible if the program is altered such that the limited critical reference restriction is observed, rather than atomicity for each line?

Yes. Once each assignment is split into statements with at most one critical reference each (two shared reads into local variables, then one shared write), other final values become possible. For example, \(3,3\) can occur if the threads proceed in approximate lock-step: both read \(a = 0\) and \(b = 0\), both write \(1\), both read \(a = 1\) and \(b = 1\), and both write \(3\).
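This too can be checked by brute force: model each assignment as two atomic reads into thread-local registers followed by one atomic write (so every statement has a single critical reference), and enumerate every interleaving of the resulting micro-operations. A sketch, with the model and all names mine:

```haskell
import Data.List (nub)

-- Global store (ga, gb) plus a pair of local registers per thread:
-- (ra, rb) for Thread A and (qa, qb) for Thread B.
data St = St { ga, gb, ra, rb, qa, qb :: Int }

-- a := a + b + 1 under LCR: read a, read b, then one atomic write.
opsA :: [St -> St]
opsA = concat (replicate 2
  [ \s -> s { ra = ga s }
  , \s -> s { rb = gb s }
  , \s -> s { ga = ra s + rb s + 1 } ])

-- b := a + b + 1, split the same way.
opsB :: [St -> St]
opsB = concat (replicate 2
  [ \s -> s { qa = ga s }
  , \s -> s { qb = gb s }
  , \s -> s { gb = qa s + qb s + 1 } ])

-- All interleavings preserving each thread's program order.
interleavings :: [x] -> [x] -> [[x]]
interleavings [] ys = [ys]
interleavings xs [] = [xs]
interleavings (x:xs) (y:ys) =
  map (x:) (interleavings xs (y:ys)) ++ map (y:) (interleavings (x:xs) ys)

finals :: [(Int, Int)]
finals = nub [ (ga s, gb s)
             | order <- interleavings opsA opsB
             , let s = foldl (flip ($)) (St 0 0 0 0 0 0) order ]
```

Among the results are pairs such as \(3,3\) and \(2,3\) that the fully atomic program cannot produce, alongside the original six.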


What is an example of a scenario in which an STM-based critical section solution can suffer from starvation?

One long-running transaction competing with many short-lived transactions can starve: each time the long transaction is ready to commit, some short transaction may already have committed a conflicting update, forcing the long transaction to restart, potentially indefinitely (although this is rare in practice).

2018-11-16 Fri 19:37
