## Homework 1, Tuesday 27-2-2018

1. A binary relation ~ on a set D is an equivalence relation if it is
• reflexive: p ~ p for all processes p in D,
• symmetric: if p ~ q then q ~ p,
• and transitive: if p ~ q and q ~ r then p ~ r.
1. Give an example of an equivalence relation on the domain of all people.
2. Give two examples of a relation on people that is not an equivalence relation because exactly one of the three requirements above fails. In each case, which requirement fails?
3. If ~ is an equivalence relation on D, and p is in D, then [p]~ denotes the set of all processes in D that are equivalent to p. Such a set is called an equivalence class. Show (formally, by means of a mathematical proof) that the collection of all equivalence classes forms a partition of D. This means that every p in D belongs to one and only one equivalence class.
[None of this has been treated in class; part 3 is a test of prior mathematical exposure to the notion of a mathematical proof. If this is hard, it is a good warm-up for bits of mathematical proof yet to come.]
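The three properties above are mechanically checkable for finite relations. As an illustration (not part of the assignment; the example relation and helper names are my own), here is a small Python sketch that tests reflexivity, symmetry and transitivity of a relation given as a set of pairs:

```python
# Sanity-checking the three equivalence-relation properties on a finite
# relation, represented as a set of (p, q) pairs over a finite domain.
def is_reflexive(rel, domain):
    return all((p, p) in rel for p in domain)

def is_symmetric(rel):
    return all((q, p) in rel for (p, q) in rel)

def is_transitive(rel):
    return all((p, s) in rel for (p, q) in rel for (r, s) in rel if q == r)

domain = {1, 2, 3}
# "has the same parity as" is an equivalence relation:
same_parity = {(p, q) for p in domain for q in domain if p % 2 == q % 2}
assert is_reflexive(same_parity, domain)
assert is_symmetric(same_parity)
assert is_transitive(same_parity)
```

The same helpers can be pointed at any finite counterexample you construct for question 2, to confirm that exactly one property fails.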
2. Consider a vending machine that accepts pieces of 20 cents, 50 cents and 1 dollar, and that can dispense chocolate and pretzels, each of which costs 1 dollar. Draw an LTS whose transitions are labelled with "20c", "50c" or "$1", indicating the reception by the machine of a coin inserted by a user, and by the actions "chocolate" or "pretzel", indicating the opening of the glass door of the machine behind which chocolate or pretzels are stored. (We abstract from the closing of the door after extraction of the pretzel, which happens automatically, and the appearance of a new pretzel behind the glass door afterwards.) Assume that the machine does not give change, and that it accepts any amount of surplus money. Try to make the number of states in your LTS as small as possible.
3. Draw a process graph for the ACP expression a(bc+b(c+d)).
4. Look at the definition given in the notes of the choice operator (+) of ACP in terms of LTSs. An alternative definition could be obtained by "identifying" the states I_G and I_H, i.e. by merging them together into one state. Is there any difference between this approach and the one of the notes? If so, illustrate this by means of examples, and tell which definition you prefer.
5. Give a formal definition of the sequential composition operator of ACP (see notes), in the style of the notes.

## Homework 2, Tuesday 6-3-2018

1. Give an example of a process that has no completed traces at all.
2. Give an example of two processes that have the same finite completed traces and the same finite partial traces, but not the same infinite traces.
3. Give an example of two processes without infinite traces that have the same partial traces but different completed traces.
4. Show, for labelled transition systems without τ-transitions, and without termination predicate, that bisimulation equivalence is strictly finer than completed trace equivalence. This means that it makes strictly fewer identifications.
This involves two things:
   • show that whenever two processes are bisimulation equivalent, they are surely also completed trace equivalent;
   • give an example of two processes that are completed trace equivalent but not bisimulation equivalent.
5. Out of the following processes, which pairs are weak bisimulation equivalent? Show the bisimulation or informally describe why there isn't any.
   • a.(b+τ.c)+τ.a.d
   • a.(b+τ.c)+τ.a.d+a.b
   • a.(b+τ.c)+τ.a.d+a.c
   • a.(b+τ.c)+τ.a.d+a.d
   Also tell which pairs are weak completed trace equivalent, and why or why not. And which pairs are strong completed trace equivalent, and why or why not.
6. Consider a more bureaucratic version of the relay race that was presented in class. This time runner 1 may only pass the stick to runner 2 in the presence of a bureaucrat, who is stationed halfway between the start and the finish. The bureaucrat will watch the transaction and subsequently make a note of it in his notebook, which will be kept for posterity. Consider a specification that mentions the start of the race (start), the delivery of the parcel (finish) and the making of the bureaucratic note (note). Describe the implementation in ACP.

## Homework 3, Tuesday 13-3-2018

1. Modify the definition of weak completed trace semantics in six different ways, namely such that
   1. deadlock = livelock and fair divergence = normal progress,
   2. fair divergence = normal progress, and livelock differs from deadlock and from normal progress,
   3. deadlock = livelock and fair divergence differs from livelock and from normal progress,
   4. livelock = fair divergence, different from deadlock and progress,
   5. deadlock = livelock = fair divergence, different from progress,
   6. all four phenomena are distinguished.
   In all cases deadlock should be distinguished from normal progress. Your notions of weak completed trace equivalence should all be strictly finer than finitary WEAK partial trace equivalence, as defined in the class notes, and strictly coarser than strong bisimilarity.
   Also, the definitions should be such that upon restricting attention to τ-free processes, they coincide with a known concept of strong completed trace equivalence. Finally, they should make sense from the perspective of comparing general processes (not just these examples). Added requirement 14/16-3-2018: for processes without divergence, all your new notions of trace semantics should agree with (finitary or) infinitary WEAK completed trace equivalence as in the notes.
2. Out of the following processes, which pairs are branching bisimulation equivalent? Show the bisimulation or informally describe why there isn't any.
   • a.(b+τ.c)+τ.a.d
   • a.(b+τ.c)+τ.a.d+a.b
   • a.(b+τ.c)+τ.a.d+a.c
   • a.(b+τ.c)+τ.a.d+a.d
3. Let P be the process given by the recursive equation X = a.(b.X + c.X) and Q be the process (c+e).a. Assume an ACP communication function γ given by γ(a,c)=γ(c,a)=d.
   1. Draw a process graph representation of P and Q.
   2. Draw a process graph representation of P||Q.
   3. Draw a process graph representation of ∂_{a,c}(P||Q).
4. Consider a process called "gatekeeper" that makes sure that of 2 other processes "1" and "2", at most one is in its critical section. Its actions are "hear request of process i to enter critical section", or h(i) for short; "admit process i into critical section", or a(i); and "observe that process i exits the critical section", e(i). Here i can have the values 1 or 2. Keep in mind that each of the processes i may enter the critical section multiple times. The actions h(i), a(i) and e(i) all synchronise with corresponding actions h(i), a(i) and e(i) from the processes 1 and 2. The gatekeeper should be able to hear a request in any state, subject to the restriction that each process will request entry in the critical section only once, until it has entered and exited the critical section.
   1. Draw a process graph describing the correct behaviour of the gatekeeper.
   2. Express the behaviour of the gatekeeper by a system of recursive equations.
   You are not asked to specify the processes "1" and "2" themselves.
5. Describe the relay race in CCS in a similar way that the class notes describe it in ACP.
6. Give an example of two processes of which some of the states may be marked with the termination predicate ✔, such that the two are (strong) bisimulation equivalent when discarding the termination predicate (without changing the states and transitions in any way), yet with the termination predicate taken into account they are completed trace equivalent but not bisimulation equivalent.

## Homework 4, Tuesday 20-3-2018

1. For which possible choices of Q and R are there transitions of the form a.(c.0+b.0) | ((c.0+a.0) | b.0) --τ--> Q --τ--> R in the labelled transition system of CCS? Give a proof tree of the formal derivation of one of those transitions from the structural operational rules of CCS.
2. Two process graphs, given as states in a labelled transition system, are isomorphic if they only differ in the identity of the nodes. Here is a formal definition. Given a CCS expression P, one can use the denotational semantics given in the notes to give the meaning of P as a process graph, or we can distill a process graph from the LTS generated by the structural operational semantics of CCS, namely by taking all states and transitions reachable from P. It is a theorem that those two graphs, both representing the meaning of P, are always bisimulation equivalent. Give a counterexample, showing that they are not always isomorphic.
3. Draw the graph of τ.(τ.(a+τ)+τ.a)+τ.(τ.(a+τ)+a). Colour the states, such that two states are branching bisimilar iff they have the same colour. Provide a branching bisimilar process graph with as few states as possible.
4. A preorder is a binary relation (on a given set D) that is transitive and reflexive. So an equivalence relation is a preorder that moreover is symmetric. A partial order is a preorder ≤ that is antisymmetric: if x ≤ y and y ≤ x then x=y.
   1. Give an example of a partial order.
   2. Give an example of a preorder that is neither a partial order nor an equivalence relation.
   3. Explain how a preorder can be seen as (contains the same information as) an equivalence relation together with a partial order of the equivalence classes.
5. Consider the fragment of CCS where (for simplicity only) we drop the parallel composition, and allow recursion of the form <X | X=E> only. Here E is a CCS expression without recursion that may contain no other variables than X. Give a denotational semantics that associates with each closed CCS expression P its set of strong partial traces PT(P). We can extract PT(P) from the process graph associated to P, but this time you should define PT(P) directly, by induction on the structure of P. As an example, PT(b.Q) can be defined to be b(PT(Q)). Here PT(Q) is the set of partial traces of Q, and b(_) is a unary operator that acts on a set of traces S by adding b at the front of each trace in S, and also adding the empty string. You should give a similar definition for the other operators. The most difficult one is to define PT(<X | X=E>). It will be the least fixpoint of some operator. Which one? Explain why such a least fixpoint always exists.

## Homework 7, Tuesday 17-4-2018

1. Consider the following three pairs of LTL-formulae:
   1. X(p U q) and (p U (Xq)) ∨ q;
   2. ((¬p) U ((¬p) ∧ (¬q))) ⇒ ¬Gp and true;
   3. (Xp) U (X(p U q)) and (X(p U q)) U (Xq). (You may want to look at the semantics of (Xp) U (Xq), and q U (q U r).)
   For each pair of formulae (φ1, φ2), determine whether φ1 and φ2 are equivalent. When your answer is "Yes" provide a proof. When your answer is "No" give a path that satisfies one formula but not the other.
2. We extend LTL by an operator Lⁿ (n ∈ ℕ), which indicates that a formula holds at least n times on a path ρ. Formally, we define ρ ⊨ Lⁿφ ⇔ ∃ k₁ ... kₙ. (ρ[kᵢ] ⊨ φ ∧ kᵢ ≠ kⱼ for 0 < i,j ≤ n ∧ i ≠ j).
   Prove that this extension is not more powerful than LTL, i.e. show that Lⁿ can be expressed in terms of standard LTL.
3. We introduce the CTL operator Release, R. Let {a, b} be a set of atomic propositions. Assume a labelled Kripke structure A=(Q, q0, →, L) with → ⊆ Q×Q, q0 ∈ Q and L: Q → 2^{a, b} (2^{a, b} is the set of subsets of {a, b}). As usual, a path in A is an infinite sequence s0 s1 ... sk ... with si ∈ Q and (si, si+1) ∈ →, for i ≥ 0. A state s ∈ Q satisfies an atomic proposition p ∈ {a, b} iff p ∈ L(s). Intuitively, the Release operator is the dual of the Until operator. Given two atomic propositions, a and b, a R b ("a releases b") holds on a path π iff ¬b is not satisfied before a is satisfied. Formally, given an infinite path π: s0 s1 ... sk ... in A, π ⊨ a R b iff π ⊨ ¬(¬a U ¬b).
   1. Does the path labelled (¬b)^ω satisfy (a R b)?
   2. Give witness paths that satisfy the formulas (a R b) ∧ Gb and (a R b) ∧ F(¬b), respectively.
   The semantics of the two CTL R operators are given by:
   • q ⊨ E(a R b) iff ∃ π = s0 s1 ... sk ..., s.t. s0=q and π ⊨ a R b
   • q ⊨ A(a R b) iff ∀ π = s0 s1 ... sk ..., s.t. s0=q, π ⊨ a R b
   3. Using the semantic definitions of the CTL Until operators E(φ U ψ) and A(φ U ψ), prove the following equivalences: E(a R b) ⇔ ¬A(¬a U ¬b) and A(a R b) ⇔ ¬E(¬a U ¬b).
   4. Prove that E(a R b) ⇔ (EGb) ∨ E(b U (a ∧ b)).
   5. Assume that automaton A is such that, for every state q ∈ Q, q ⊭ EGp for p ∈ {a, b}. Give a fixpoint algorithm to compute the truth value of E(a R b). Is it a least or greatest fixpoint algorithm?

## Homework 8, Tuesday 24-4-2018

1. Sequential composition (as used for instance in ACP) is usually defined on process graphs equipped with a state-labelling, where some states are marked with the symbol for successful termination, ✔. Here the second component starts only when the first one successfully terminates. Sequencing is an alternative form of sequential composition.
   It is a binary operation defined on any pair of process graphs; it starts with its second component as soon as its first component cannot do any further actions. This is the sequential composition that occurred in HW5.
   1. Is sequential composition (as used e.g. in ACP) compositional for partial trace equivalence? Give a counterexample, or show why.
   2. Is sequential composition compositional for completed trace equivalence? Give a counterexample, or show why.
   3. Is partial trace equivalence a congruence for sequencing? Show why your answer is correct.
   4. Is (strong) completed trace equivalence a congruence for sequencing? Show why your answer is correct.
   5. Is weak completed trace equivalence =_{WCT}^a as defined here a congruence for sequencing? Show why your answer is correct.
   6. Same for =_{WCT}^b, =_{WCT}^c, =_{WCT}^d, =_{WCT}^e and =_{WCT}^f.
   7. Is weak bisimulation equivalence a congruence for sequencing? Show why your answer is correct.
   8. Is strong bisimulation equivalence a congruence for sequencing? Show why your answer is correct.
2. Here is an operational semantics of the sequencing operator:

       x --a--> x'
       ---------------- (for x' ≠ 0)
       x;y --a--> x';y

       x --a--> 0
       ------------
       x;y --a--> y

   Is it in GSOS format? What is wrong with this operational semantics? Show that strong bisimulation is not a congruence for the operator defined by these operational rules (in the context of CCS).
3. Show that (strong) completed trace equivalence is a congruence for the abstraction operator. This means that whenever P =_CT Q and I is any set of actions, it must be the case that τ_I(P) =_CT τ_I(Q).
4. An equivalence ~ is a congruence for a language L if P ~ Q implies that C[P] ~ C[Q] for every context C[ ]. Here a context C[ ] is an L-expression with a hole in it, and C[P] is the result of plugging in P for the hole. The congruence closure ~^L of ~ is defined by P ~^L Q if and only if C[P] ~ C[Q] for any L-context C[ ].
   Prove that ~^L is the coarsest congruence finer than ~.
5. Consider a language L without recursion. A different definition of a congruence ~ requires for any n-ary operator f that P_i ~ Q_i for i=1,...,n implies f(P_1,...,P_n) ~ f(Q_1,...,Q_n). Show that this definition is equivalent to the one above.
6. Model the relay race in CSP.
7. How would you translate the CSP expressions (ab) ⊓ c and (ab) ☐ c into CCS? Here I omitted trailing .0's.
8. Consider the processes a.0 + b.0, τ.(a.0 + b.0), τ.a.0 + b.0, a.τ.0 + b.0 and τ.(a.0 + b.0) + a.0.
   1. Which of these processes are weak bisimulation equivalent to each other, and why (or why not)?
   2. Which of these processes are branching bisimulation equivalent to each other, and why (or why not)?
   3. Which of these processes are rooted weak bisimulation equivalent to each other, and why (or why not)?
   4. Which of these processes are rooted branching bisimulation equivalent to each other, and why (or why not)?

## Homework 9, Tuesday 1-5-2018

1. In the paper available at http://www.cse.unsw.edu.au/~rvg/pub/spectrum1.ps.gz there are a number of pictures, called Counterexample 1, 2, 3, etc. For each of counterexamples 2 to 6, provide an HML formula (if needed with infinite conjunctions) that holds for the left process, but not for the right one. A (bad) pdf of the paper is here.
2. Give an example of a process that satisfies the HML formulas [a]⟨b⟩T and ⟨a⟩([b]⟨c⟩T) but does not satisfy [a]⟨b⟩⟨c⟩T.
3. Give a CTL-X formula that shows the difference between the processes b+τ.a and b+a+τ.a, or show that there isn't any.
4. Consider the processes P, Q, R and S displayed here. Which pairs of these processes are branching bisimilar with explicit divergence? Show why your answer is correct.
5. Recall the specification and implementation of the relay race in CCS (HW 3, Q5).
   Prove the rooted weak bisimulation equivalence between the specification and the implementation by means of equational logic, using the equational axiomatisation of CCS. We did not treat all the axioms, but that's no problem: just list all the axioms that you need for this verification, and make sure they are sound.

I'll also accept retries of Q4 from last week.

## Second chance for homework 6 (due Tuesday 1pm)

For those of you that didn't do homework 6 (given by Kai) because you were at a programming competition, or for any other reason, here is a replacement homework. Even if you did HW 6, you may still do the one below, if you like. Credit will be the maximum of the original and the replacement HW, worth the same number of points.

1. Consider the CTL formulas AF EGp and AG EFp. Give examples of Kripke structures in which one holds and not the other, and vice versa. (Or partial credit for showing Kripke structures illustrating the difference between the four formulas AFp, EFp, EGp and AGp.)
2. Model as an LTL formula:
   1. If at any time the condition φ is satisfied, then we will surely reach a state where ψ holds.
   2. If the condition φ is satisfied infinitely often, then we will surely reach a state where ψ holds.
3. The operator θ_{a<b} always gives precedence to the specific action b over the specific action a. It does so by deleting all a-transitions from a state when there is a b-transition from that state. So θ_{a<b}(a(a+b(a+c(ba+a)))) = ab(a+cba).
   1. Give a structural operational semantics of this operator.
   2. Is strong bisimilarity a congruence for this operator? How do you know?
   3. Is completed trace equivalence a congruence for this operator? How do you know?

You may include your answers with this week's regular homework, or submit them through the interface separately (no more risk of erasing the regular HW).

## Homework 10, Tuesday 8-5-2018

## Second chance for homework 7 (due Tuesday 1pm)

1. Consider the following six LTL-formulae:
   1. FGp
   2. GFp
   3. FGFp
   4. GFGp
   5. (FGp)U(GFp)
   6. (GFp)U(FGp)
   For each pair of formulae (φ1, φ2), determine whether φ1 and φ2 are equivalent. When your answer is "Yes" provide a proof. When your answer is "No" give a path that satisfies one formula but not the other.
2. Consider a Kripke structure describing the states of a mutual exclusion protocol. The purpose of the protocol is to ensure that two clients A and B are never simultaneously in the critical section, and to guarantee that both clients keep making adequate progress. The atomic propositions are CritA and CritB, saying that client A (or B) is currently in the critical section; and IntendA, IntendB, saying that client A (or B) intends to enter the critical section, but has not done so yet.
   1. Formulate an LTL property in terms of these four atomic propositions expressing the correctness of the protocol.
   2. Formulate an LTL property saying that if Process A intends to go critical, then Process B will enter its critical section at most once before A does.
3. CTL_{-X} has the grammar φ,ψ ::= ⊤ | φ ∧ ψ | ¬φ | AF φ | EF φ | AG φ | EG φ | A(φ U ψ) | E(φ U ψ). Fix the propositional part (⊤ | φ ∧ ψ | ¬φ). A subset of the remaining 6 modalities forms a basis if each remaining modality can be expressed in terms of the basis (with the propositional connectives). Give the complete list of minimal bases. Show (or argue) for each of them why it is a basis, and why it is minimal.

You may include your answers with this week's regular homework prior to Monday 10am, or submit them through the interface separately after Monday 3pm.

## Homework 11, Tuesday 15-5-2018

1. Give a Petri net for the following CCS expressions:
   1. a.(b.0 + c.0)
   2. a.(b.0 + (c.0 | \bar c.d.0))
   3. (a.(b.0 + a.0)) | a.\bar a.b.0
   4. a.0 + X, where X is given by the recursive equation X=a.X.
2. Let's define a step-trace of a Petri net to be a sequence, not of actions, but of multisets of actions, such that the represented system can perform the sequence in succession, at each step performing the required multiset of actions simultaneously. (If you don't know what a multiset is, just think of it as a set.) For example, the process a(b + (c||d)) would have the following completed step-traces: {ab, acd, adc, a[cd]}. Here [cd] denotes the multiset containing c and d one time each.
   1. Give the set of completed step traces of a(b + c) || d.
   2. Give an example of two processes that are strong bisimulation equivalent but have different step-traces.
   3. Give an example of two processes that are strong bisimulation equivalent but have different minimal running times when associating durations to actions. Here durations are allocated by a function from the set of all actions to the positive real numbers. For instance duration(a)=1 and duration(b)=2. Now all occurrences of the action b will take 2 time units. You may choose any function duration that works for you. Explain why the minimal running times will be different.
   4. Show that strong bisimilarity is not preserved under action refinement. That is, give two processes that are strongly bisimilar, such that after replacing each copy of a single action a with a more complicated process (like a1;a2) they are no longer strongly bisimilar.
   5. Give an example of two processes that have the same completed step traces, but have different minimal running times when associating durations to actions. Explain why the minimal running times will be different.
   6. Use the same example to argue that completed step trace equivalence is not preserved under action refinement. That is, give two processes with the same completed step traces, such that after replacing each copy of a single action a with a more complicated process (like a1;a2) they are no longer completed step trace equivalent.
3. Consider Peterson's algorithm (see HW10).
   1. Represent it as a Petri net with 18 places: 6 places for each of the 6 internal states of the processes A and B and 2 places for each of the 3 variables.
   2. Give two (essentially different) invariants for this net.
4. This process graph constitutes the correct answer to Question 1(i) from last week. The name of a state ℓ_i m_j T, where T is A or B, indicates that process A is in state ℓ_i, process B in state m_j, and the variable turn has value T. The actions nA, cA, nB and cB are visible: they represent the non-critical and critical sections of processes A and B in Peterson's protocol. All unlabelled transitions are τ's.
   1. How many reachable states are in the process graph after identifying strongly bisimilar states? Tell or show which states get identified (for instance by drawing lines between equivalence classes on the sheet with the graph). (The most secure way to figure this out is by using the partition refinement algorithm, or reasoning in the spirit of that algorithm.)
   2. How many reachable states are there after identifying branching bisimilar states? (In other words: what is the least number of colours needed in a consistent colouring of this graph?) Tell or show which states get identified.
5. In the second chance for homework 7 (given last week) I asked for an LTL formula expressing the correctness of a mutual exclusion algorithm, expressed using the atomic propositions CritA and CritB, saying that process A (or B) is currently in the critical section; and IntendA, IntendB, saying that process A (or B) intends to enter the critical section, but has not done so yet. Here I will identify CritA and CritB with the transition labels cA and cB in the process graph of Peterson's protocol.
   1. One answer I got was $$G(\neg CritA \vee \neg CritB) \wedge GF(CritA \vee CritB)$$ The first part expresses the safety property of the protocol: never will both processes A and B be simultaneously in the critical section.
   The second part says the system continues to make progress: it will happen infinitely often that some process will enter the critical section. Does this formula hold for Peterson's protocol (under the same translation η from process graphs to Kripke structures we have been using in class)? Explain why this is the case or describe a run of the protocol that forms a counterexample.
   2. Another answer I got was $$G(\neg (CritA \wedge CritB)) \wedge GF(CritA) \wedge GF(CritB)$$ Does this formula hold for Peterson's protocol? Explain why this is the case or describe a run of the protocol that forms a counterexample.
   3. Suppose that process A indicates it wants to go into the critical section. (How would it do so? See the description of homework 10, Question 1.) Then one of the above formulas would allow a run in which this never happens. Which one?
   4. Formulate an LTL property that expresses the correctness of mutual exclusion protocols such that the behaviour described in part 3 is excluded, and such that the formula holds for Peterson's protocol.
   5. In which states should we define the proposition IntendA (resp. IntendB) to hold for the property formulated under part 4 to hold?
6. In the same second chance for homework 7 I also asked for an LTL property φ saying that if Process A intends to go critical, then Process B will enter its critical section at most once before A does. A blend between two correct answers I got is $$G(IntendA \wedge enterB \Rightarrow X(\neg(enterB) \,U\, CritA))$$ Here $enterB$ is a shorthand for $\neg CritB \wedge X(CritB)$. This formula assumes that $IntendA$, once true, stays true until $CritA$ occurs. Does this formula hold for Peterson's protocol? Explain why this is the case or describe a run of the protocol that forms a counterexample.

## Final exam

Here is the exam I gave when I taught this course last time. It should give a good indication of what kind of questions (how many and how hard) to expect.
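Question 4 of Homework 11 above refers to the partition refinement algorithm for strong bisimilarity. As a sketch (my own toy code, a naive version rather than the efficient Paige-Tarjan algorithm from the literature), it repeatedly splits blocks of states according to their one-step signatures until the partition is stable:

```python
# Naive partition refinement for strong bisimilarity on a finite LTS.
def bisimulation_classes(states, transitions):
    """transitions: a set of (source, action, target) triples."""
    blocks = [set(states)]
    while True:
        def signature(s):
            # the (action, target block) pairs reachable from s in one step
            return frozenset((a, i) for (p, a, q) in transitions if p == s
                             for i, b in enumerate(blocks) if q in b)
        new_blocks = []
        for block in blocks:
            by_sig = {}
            for s in block:
                by_sig.setdefault(signature(s), set()).add(s)
            new_blocks.extend(by_sig.values())
        if len(new_blocks) == len(blocks):   # no block was split: stable
            return blocks
        blocks = new_blocks

# The classic pair a.b.0 + a.c.0 (root p) versus a.(b.0 + c.0) (root q):
# completed trace equivalent, yet p and q land in different blocks.
states = {'p', 'p1', 'p2', 'q', 'q1', 'end'}
transitions = {('p', 'a', 'p1'), ('p', 'a', 'p2'), ('p1', 'b', 'end'),
               ('p2', 'c', 'end'), ('q', 'a', 'q1'), ('q1', 'b', 'end'),
               ('q1', 'c', 'end')}
assert not any('p' in b and 'q' in b
               for b in bisimulation_classes(states, transitions))
```

Reasoning "in the spirit of the algorithm", as the question suggests, amounts to running these splitting rounds by hand on the drawn process graph.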
## Homework 12, Tuesday 22-5-2018 (the last one)

1. In the class notes you can find definitions of validity for old-fashioned as well as reactive CTL and LTL formulae on states in labelled transition systems. These definitions have a progress assumption built in. How do these definitions need to be changed to drop the progress assumption?
2. In this question we define the notion $P \models_B^F \phi$. It says that the CTL or LTL property $\phi$ holds for the process $P$ under the fairness assumption $F$, when assuming the set of actions $B$ might be blocking. Here a (local) fairness assumption $F$ consists of a task and the keyword weak or strong. A task is simply a set of transitions in the labelled transition system in which $P$ occurs as a state. We call a path $F$-unfair if for some suffix $\pi$ of that path (a transition from) the task is enabled in all (or infinitely many) states on $\pi$, yet (a transition from) the task never occurs in $\pi$. (Here "all" applies to weak fairness properties and "infinitely many or all" to strong ones.) A path in a Kripke structure distilled from an LTS is $F$-fair iff it stems from an $F$-fair path in the LTS. We have $P \models_B^F \phi$ iff $\phi$ holds for all $F$-fair $B$-complete paths starting in $P$. Now give an example $(P,B,F,\phi)$ such that $P \models_B^F \phi$, but not $P \models_\emptyset^F \phi$ and not $P \models_B \phi$.
3. Consider a variant of Peterson's mutual exclusion protocol that involves 4 parallel processes instead of 2, when not counting the 3 shared variables. Instead of the party "Process A" we have "Client A" and "Interface A". The client may engage in critical and/or non-critical sections. The interface does not have critical or non-critical sections; it merely negotiates with the other party when the client may enter the critical section. The same holds for B. We will only describe Interface A and Interface B, not the clients, for the interfaces should work correctly with any well-behaved clients.
   Client A and Interface A communicate through 3 shared actions: $r_A$ is a request by the client to the interface to enter the critical section; $g_A$ is a signal from the interface to the client, allowing entry to the critical section ($g$ stands for "granting the request"); $e_A$ is a signal from the client to the interface, saying it is exiting the critical section. We assume the clients are well behaved. This means: Client $i$ (where $i\in\{A,B\}$) will engage in the actions $r_i$, $g_i$ and $e_i$ only in this order (looping back to $r_i$ after $e_i$). Whenever client $i$ gets permission to enter ($g_i$), it will, after a finite period of time, leave the critical section and signal this by sending the signal $e_i$. When the interface performs the action $g_i$, it cannot be blocked by the client. There is no obligation for a client to ever do $r_i$.
   1. Modify Peterson's protocol Process A into Interface A, such that it fits the above description (and works as well as it could). Do not make more modifications than necessary.
   2. Describe the correctness property of the protocol as an LTL formula. Tell in which states IntendA should hold, so that this correctness criterion is satisfactory from the perspective of a client. (Thus, two clients should never be simultaneously in the critical section, and whenever a client wants to enter it should eventually do so.)
   3. Does this property hold for the protocol as you described it? Explain.
   4. If the answer to part 3 is yes, does it still hold if you use LTL without the progress assumption, as you described in answer to Question 1? Explain.
   5. If the answer to part 3 is no, does it help to add a fairness property as described under Question 2? And what is the weakest fairness property that works? Explain.
   6. Does the property hold when assuming justness?
   7. Consider only the safety part of the correctness property: two processes may not be simultaneously in the critical section. Express this property as safety(B), where B is a set of (bad) traces of the protocol.
   That is, what is B in this case?
4. Consider a system that performs exactly 3 actions: $c$ = customer signs contract; $p$ = customer pays; $d$ = shop delivers service to customer.
   1. Model this system as a Chu space, such that delivery can happen any time after the customer has signed as well as paid.
   2. Upgrade the above model so that it shows that paying and signing are really concurrent.
   3. Model this system as a Chu space, such that delivery happens as soon as the customer has signed as well as paid. Indicate why this system cannot so easily be described as a process graph.
5. Compare the systems $P=a.\langle X \mid X=b+c.(b+c.X)\rangle + a.\langle X \mid X=d+c.(d+c.X)\rangle$ and $Q=a.\langle X \mid X=b+c.(d+c.X)\rangle + a.\langle X \mid X=d+c.(b+c.X)\rangle$.
   1. Are these systems equivalent in default failures semantics? (This version of failures semantics was proposed as the default semantics of the process algebra LOTOS, which has been standardised by the International Organization for Standardization (ISO) in the eighties. Two processes are equivalent iff they have the same failure pairs as defined by Ray. A process $p$ has a failure pair $(\sigma,X)$ if $p \stackrel\sigma\Longrightarrow p'$ for some $p'$ with $I(p') \cap X = \emptyset$. Here $\sigma \in A^*$, where $A$ is the set of actions processes can do without $\tau$, and $X \subseteq A$. The double arrow indicates that $\tau$-actions may be interspersed with the actions in the sequence $\sigma$. Furthermore $I(p') := \{a \in A \mid p' \stackrel{a}\Longrightarrow p'' \mbox{~for some $p''$}\}$, i.e. the initial actions $p'$ can do, possibly after first doing some $\tau$-actions.)
   2. Are these systems equivalent in stable failures semantics? (Stable failures semantics is defined as above, except with the extra requirement that $p'$ cannot do a $\tau$-transition. So each stable failure pair is certainly a default failure pair, but not vice versa.)
   3. Are these systems equivalent in the version of failures semantics employed in CSP?
   (Remember Ray's remarks on catastrophic divergence and/or do a web-search on "catastrophic divergence".)
   4. Now the same 3 questions for the systems $P/c$ and $Q/c$. Here $/c$ is the concealment operator of CSP, identical to the abstraction operator $\tau_{\{c\}}$ of ACP. It turns all $c$-transitions into $\tau$-transitions.
   5. What conclusion can you draw regarding congruence properties?
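The definition of default failure pairs in Question 5 can be animated on small finite examples. Below is a toy enumeration of my own (the helper names are made up; it bounds the trace length, so it is only adequate for acyclic examples like the one tested), with `'tau'` playing the role of the silent action:

```python
from itertools import combinations

# Bounded-depth enumeration of default failure pairs (sigma, X):
# p ==sigma==> p' with I(p') disjoint from X, tau's interspersed.
def tau_closure(states, trans):
    seen, todo = set(states), list(states)
    while todo:
        s = todo.pop()
        for (p, a, q) in trans:
            if p == s and a == 'tau' and q not in seen:
                seen.add(q)
                todo.append(q)
    return seen

def weak_after(state, sigma, trans):
    # states reachable from `state` by performing sigma, tau's interspersed
    current = tau_closure({state}, trans)
    for a in sigma:
        step = {q for s in current for (p, b, q) in trans if p == s and b == a}
        current = tau_closure(step, trans)
    return current

def initials(state, trans):
    # I(p'): visible actions possible, perhaps after some tau's
    return {b for s in tau_closure({state}, trans)
            for (p, b, q) in trans if p == s and b != 'tau'}

def failure_pairs(state, trans, alphabet, max_len=2):
    words = [()]
    for _ in range(max_len):
        words += [w + (a,) for w in words for a in alphabet]
    subsets = [frozenset(c) for r in range(len(alphabet) + 1)
               for c in combinations(sorted(alphabet), r)]
    return {(sigma, X) for sigma in set(words)
            for p2 in weak_after(state, sigma, trans)
            for X in subsets if not (initials(p2, trans) & X)}

# tau.a.0 and a.0 have the same default failure pairs:
trans = {('p0', 'tau', 'p1'), ('p1', 'a', 'p2'), ('q0', 'a', 'q1')}
assert failure_pairs('p0', trans, {'a'}) == failure_pairs('q0', trans, {'a'})
```

For stable failures one would additionally require that the reached state has no outgoing `'tau'`-transition, matching the extra requirement stated in part 2 of Question 5.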

Rob van Glabbeek, rvg@cse.unsw.edu.au