CSE Advanced Operating Systems
COMP9242 2017/S2
UNSW
CRICOS Provider Number: 00098G


On-Line Survey 2015

Survey ID: 1398
Title: COMP9242 15
Description: Course Evaluation Survey for COMP9242 Advanced Operating Systems. Version for Session 2, 2015.
Anonymous: Yes
Fill Ratio: 81% (13/16)
# Filled: 13
# Suspended: 2
# Not Filled: 1
(required) indicates required field
Your comments will help us to assess and improve our courses, not only for future generations, but for your further study in CS&E. We really look at the results and appreciate your feedback! Several changes to the course over the years were a direct result of student feedback. And, as always, we'll publish the uncensored results on the course web site.


Note: Please do not enter "no comment" or something similar into comment boxes. If you don't have anything to say, just leave the box empty.
1. Quick Evaluation
1. Give a high rating if you have a good opinion of something (e.g. interesting, useful, well-structured, etc.). Give a low rating if you have a bad opinion of something (e.g. too slow, confusing, disorganised, etc.)  (required)
Question type : Single answer -- Radio Button
  Excellent ... Satisfactory ... Poor  (5-point scale)
Gernot Heiser 10 (77%) (23%) (0%) (0%) (0%)
Kevin Elphinstone 12 (92%) (8%) (0%) (0%) (0%)
Guest lecturer Ihor Kuz 11 (85%) (8%) (8%) (0%) (0%)
Guest lecturer Toby Murray (46%) (38%) (15%) (0%) (0%)
Guest lecturer Peter Chubb (54%) (46%) (0%) (0%) (0%)
Tutors/demonstrators (54%) (31%) (15%) (0%) (0%)
Exam (31%) (54%) (15%) (0%) (0%)
Course web pages (54%) (8%) (31%) (8%) (0%)
Reference material (38%) (31%) (15%) (8%) (8%)
Computing resources (38%) (23%) (15%) (23%) (0%)
COMP9242 overall 11 (85%) (8%) (8%) (0%) (0%)
2. General
2. Which factors most influenced your decision to enrol in this course?  (required)
Question type : Multiple answer -- Check Box
Interest in operating systems as an area of study 11 (85%) chart
Chance to build a system 11 (85%) chart
Chance to get fingers really dirty 10 (77%) chart
Would like to do some systems research (31%) chart
Looking for a challenge 11 (85%) chart
Looking for an easy course (15%) chart
Rupert told me to (8%) chart
Friends told me it was good (69%) chart
3. Other factors not mentioned above?
Question type : Short-answer
Answers at the bottom of the page (3 comments)
4. Would you recommend this course to another student such as yourself?  (required)
Question type : Single answer -- Radio Button
Yes 13 (100%) chart
No (0%) chart
5. The course is heavy on design and implementation issues. It also tries to remain close to present research issues (although that aspect has suffered with the move to 12 teaching weeks). What do you think about the content allocation?  (required)
Question type : Single answer -- Radio Button
  Too much ... Just right ... Too little  (5-point scale)
Theory/general principles (0%) (23%) (69%) (8%) (0%)
OS design and implementation (0%) (15%) (46%) (38%) (0%)
Current research issues (0%) (8%) (54%) (38%) (0%)
6. What were the best things about this course?
Question type : Long-answer
Answers at the bottom of the page (12 comments)
7. What were the worst things about this course?
Question type : Long-answer
Answers at the bottom of the page (10 comments)
8. How does the workload in this course compare to workloads in other ...  (required)
Question type : Single answer -- Radio Button
  Much lighter ... Similar ... Much heavier  (5-point scale)
COMP courses at this level (0%) (0%) (8%) (46%) (46%)
COMP courses in general (0%) (0%) (8%) (8%) 11 (85%)
Courses in general (0%) (0%) (0%) (15%) 11 (85%)
9. How does the overall quality/value of this course compare to other ...  (required)
Question type : Single answer -- Radio Button
  Among the best ... Average ... Among the worst  (5-point scale)
COMP courses at this level 11 (85%) (15%) (0%) (0%) (0%)
COMP courses in general 10 (77%) (23%) (0%) (0%) (0%)
courses in general 11 (85%) (15%) (0%) (0%) (0%)
10. What background knowledge do you think you were missing that would have helped you in this course? Is distinction in COMP3231/9201 a suitable preparation? Is it too harsh?
Question type : Short-answer
Answers at the bottom of the page (8 comments)
3. Content/Syllabus
11. Please rate the relevance/appropriateness of the lecture topics.  (required)
Question type : Single answer -- Radio Button
  Very relevant ... Average ... Inappropriate  (5-point scale), plus N/A
Introduction: Microkernels and seL4 (62%) (31%) (8%) (0%) (0%) (0%)
Caches (46%) (31%) (8%) (15%) (0%) (0%)
OS Execution models: Threads vs Events 10 (77%) (15%) (8%) (0%) (0%) (0%)
Virtual Machines (54%) (31%) (15%) (0%) (0%) (0%)
Performance Evaluation (69%) (23%) (8%) (0%) (0%) (0%)
SMP and Locking (23%) (62%) (8%) (8%) (0%) (0%)
Real-Time Systems (23%) (38%) (38%) (0%) (0%) (0%)
Linux (38%) (23%) (31%) (0%) (0%) (8%)
Microkernel Design (62%) (23%) (8%) (0%) (0%) (8%)
Security (54%) (31%) (0%) (8%) (0%) (8%)
Multiprocessors 2, Drawbridge (38%) (31%) (23%) (0%) (0%) (8%)
Local Systems Research (23%) (46%) (31%) (0%) (0%) (0%)
Sample paper analysis (54%) (31%) (15%) (0%) (0%) (0%)
12. Please tell us how interesting you found the lecture topics.  (required)
Question type : Single answer -- Radio Button
  Very interesting ... Ok ... Boooooring!  (5-point scale), plus Skipped
Introduction: Microkernels and seL4 (46%) (15%) (31%) (8%) (0%) (0%)
Caches (46%) (8%) (31%) (15%) (0%) (0%)
OS Execution Models: Threads and Events (69%) (15%) (8%) (8%) (0%) (0%)
Virtual Machines (46%) (15%) (31%) (8%) (0%) (0%)
Performance Evaluation (54%) (8%) (23%) (15%) (0%) (0%)
SMP and Locking (54%) (15%) (31%) (0%) (0%) (0%)
Real-Time Systems (31%) (0%) (54%) (8%) (8%) (0%)
Linux (46%) (31%) (15%) (0%) (0%) (8%)
Microkernel Design (46%) (31%) (15%) (0%) (0%) (8%)
Security (54%) (15%) (15%) (8%) (0%) (8%)
Multiprocessors 2, Drawbridge (46%) (0%) (31%) (8%) (0%) (15%)
Local Systems Research (31%) (23%) (31%) (15%) (0%) (0%)
Sample paper analysis (23%) (38%) (38%) (0%) (0%) (0%)
13. Which material do you think will be most useful to you in the future?  (required)
Question type : Long-answer
Answers at the bottom of the page (13 comments)
14. Which material, not currently in this course, would you like to have seen covered?
Question type : Long-answer
Answers at the bottom of the page (8 comments)
15. Which of the current topics would you like to see scaled back or excluded?
Question type : Long-answer
Answers at the bottom of the page (7 comments)
4. Lectures
16. What factors caused you to attend lectures?  (required)
Question type : Multiple answer -- Check Box
I had enough spare time (62%) chart
The lectures were too good to miss 11 (85%) chart
Given the pace and lack of a textbook, I could not afford to miss the lectures (23%) chart
It was as good a place as any to take a nap (8%) chart
I wanted to be seen to be there (15%) chart
None, I skipped most (0%) chart
17. What were the reasons for skipping lectures?  (required)
Question type : Multiple answer -- Check Box
Overall workload in this and other courses (31%) chart
Lecture notes and references cover the material adequately (8%) chart
Lectures are boring (0%) chart
There was not enough material to justify attending lectures (0%) chart
First half of the course was more interesting than second half (15%) chart
None, I attended (almost) all (69%) chart
18. Any suggestions for improving lectures?
Question type : Long-answer
Answers at the bottom of the page (8 comments)
5. Project
19. What was the level of difficulty of the various parts of the project?  (required)
Question type : Single answer -- Radio Button
  Too easy ... Just right ... Too hard  (5-point scale)
Milestone 0 (0%) (23%) (69%) (8%) (0%)
Milestone 1 (0%) (15%) 10 (77%) (8%) (0%)
Milestone 2 (15%) (8%) 10 (77%) (0%) (0%)
Milestone 3 (0%) (8%) 10 (77%) (15%) (0%)
Milestone 4 (15%) (0%) 10 (77%) (8%) (0%)
Milestone 5 (0%) (8%) (62%) (31%) (0%)
Milestone 6 (0%) (15%) (23%) (46%) (15%)
Milestone 7 (0%) (23%) (15%) (54%) (8%)
Milestone 8 (15%) (15%) (62%) (8%) (0%)
System documentation (0%) (15%) (62%) (15%) (8%)
Project overall (0%) (0%) (69%) (31%) (0%)
20. How well was the project specified?  (required)
Question type : Single answer -- Radio Button
  Very clear ... Ok ... Confusing  (5-point scale)
Milestone 0 (23%) (23%) (31%) (23%) (0%)
Milestone 1 (31%) (31%) (23%) (15%) (0%)
Milestone 2 (15%) (23%) (46%) (15%) (0%)
Milestone 3 (15%) (23%) (15%) (46%) (0%)
Milestone 4 (8%) (15%) (46%) (31%) (0%)
Milestone 5 (15%) (15%) (38%) (31%) (0%)
Milestone 6 (8%) (15%) (31%) (31%) (15%)
Milestone 7 (8%) (23%) (31%) (38%) (0%)
Milestone 8 (8%) (15%) (38%) (38%) (0%)
System documentation (8%) (31%) (31%) (31%) (0%)
Project overall (8%) (15%) (54%) (23%) (0%)
21. What was the quality of...  (required)
Question type : Single answer -- Radio Button
  Excellent ... Ok ... Poor  (5-point scale)
Documentation/reference material (8%) (54%) (0%) (23%) (15%)
Supplied code (8%) (46%) (38%) (8%) (0%)
Hardware platform (23%) (54%) (15%) (8%) (0%)
Consultation time help/support (62%) (23%) (15%) (0%) (0%)
On-line help/support (31%) (15%) (54%) (0%) (0%)
6. Project mechanics
Last year we introduced a number of changes in the way the project was done and administered. They seem to have worked well, so we kept them for this year. We're still interested in feedback though.
22. Do you have comments/opinions on the demo submission and web-based demonstration process?
Question type : Long-answer
Answers at the bottom of the page (8 comments)
23. Do you have comments on the compulsory use of git for revision control?
Question type : Long-answer
Answers at the bottom of the page (10 comments)
24. We released the project as a git "repo", with the ability to pull patches (and push pull requests for your fixes) while the project was underway. Do you have comments on this?
Question type : Long-answer
Answers at the bottom of the page (9 comments)
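A minimal sketch of the pull-patches workflow this question describes, using throwaway local repositories as stand-ins for the course repo (the real repo URL, branch layout, and commit messages are assumptions, not part of the survey):

```shell
# Sketch only: a local "upstream" repo plays the role of the course-provided
# project repo; all paths and commit messages here are hypothetical.
set -e
work=$(mktemp -d)
cd "$work"

# The staff repo from which the project was released.
git -c init.defaultBranch=main init -q upstream
git -C upstream -c user.email=staff@example -c user.name=staff \
    commit -q --allow-empty -m "project release"

# A student group clones it and works in their own copy.
git clone -q upstream project
cd project

# Mid-semester, staff publish a fix upstream...
git -C ../upstream -c user.email=staff@example -c user.name=staff \
    commit -q --allow-empty -m "fix: clarify milestone spec"

# ...and students pick it up with a plain pull (clone set up the tracking
# branch, so this fast-forwards onto the staff fix).
git pull -q
git log --oneline -1
```

Pull requests for student fixes would flow the other way, pushing a branch back to the staff repo; that half is omitted here.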
7. Summer Internships
25. Each year we encourage the students in this course to do a summer internship with us (i.e. the Trustworthy Systems group at NICTA). Typically fewer than half take up the opportunity, and we'd like to understand why.
Question type : Single answer -- Radio Button
I am doing a ToR/NICTA Summer Internship with Trustworthy Systems (15%) chart
I am doing a ToR/NICTA Summer Internship with another group (0%) chart
I am working with Trustworthy Systems under some other arrangement (15%) chart
I received an offer but declined (0%) chart
I applied but did not get an offer (8%) chart
I did not apply (62%) chart
N/F (not filled) 0 (0%)
26. If you are not doing a ToR/Summer Internship with us, or working with us under some other arrangement, please let us know why.
Question type : Long-answer
Answers at the bottom of the page (11 comments)
8. Anything Else
27. Tutors report some unhappiness with the way milestones or bonuses were marked. Please let us know if there were any issues.
Question type : Long-answer
Answers at the bottom of the page (10 comments)
28. Any other comments/suggestions that might help us to improve the course in the future?
Question type : Long-answer
Answers at the bottom of the page (5 comments)

3. Other factors not mentioned above?
1: it was one of the only things that fit my timetable
2: Wanted to learn rust
3: who is rupert?
6. What were the best things about this course?
1: wonderful project
2: First experience with a major software project, little to no 'hand-holding', chance to experiment.
3: Challenging project. Interesting lectures.
4: Solid programming exercise, fun design problems, interesting papers.
5: the leniency with late submissions
6: The chance to work on a semester long project, where choices in earlier weeks influence outcomes in later weeks.
7: getting to play around with an OS.
8: Challenging project and interesting material
9: The project
10: We get to play with a real OS
11: The project was excellent. The lectures were excellent.
12: Project work.
7. What were the worst things about this course?
1: Sometimes assignment requirements weren't entirely clear.
2: Grading scheme on the project, just do a Buckland! Should have prodded us to do multiserver design, given the unique expertise of the instructors
3: Tutors sometimes not knowing the spec well enough. Assessment guidelines are often unclear.
4: The project
5: milestone spec was sometimes vague. A lot like the normal OS course. seL4 was confusing to work with at first; few guidelines and little documentation on it. Marking was a bit inconsistent amongst tutors.
6: no time to relax or do anything else
7: Poor milestone specs (should be clearer what we need to do).
8: The lecture content is too general and has little relationship to the project or the exam, as far as I can see.
9: Incredibly vague assessment criteria, seL4 'documentation'.
10: The specs online were terrible, but apparently this is to encourage students to show up to consultations? This should be made clear.
10. What background knowledge do you think you were missing that would have helped you in this course? Is distinction in COMP3231/9201 a suitable preparation? Is it too harsh?
1: setjmp/longjmp usage
2: Add a COMP1927 and COMP2911 prereq. Make sure people can write a program
3: that is all you needed to know. everything else comes down to self driven determination and a good attitude.
4: It's a reasonable preparation
5: CS3231:DN seems about right.
6: Prereq is fine as it is. Not knowing content is fine as long we are given the resources to learn it ourselves (which we mostly were).
7: Distinction is fine. A real passion for operating systems would also be acceptable.
8: Background was okay, there was some stuff from comp3891 that I'd never really gotten, which would have been helpful. Prereqs were okay, assuming that you do well, so distinction is probably fair.
13. Which material do you think will be most useful to you in the future?  (required)
1: Caches and other architectural info, OS execution models. Performance evaluation was also really good.
2: Virtual Machines, Execution Models
3: The design and performance evaluation aspects of system design.
4: Paper analysis skills, critical analysis, skepticism
5: OS Execution Models: Threads and Events and Virtual Machines
6: Execution models, multiprocessor design, security.
7: SMP and locking. Virtual machines. Performance evaluation. Paper analysis.
8: Performance evaluation. Caching.
9: Performance eval, capabilities, execution models.
10: most of the content. maybe things that are not microkernel specific like caches. they are a niche at the moment.
11: Locking, multiserver design, cache understanding, virtualisation
12: Caches, OS Execution Models, SMP and Locking
13: virtual machines, smp + locking
14. Which material, not currently in this course, would you like to have seen covered?
1: Network stack design, given its increasing importance with the use of virtualization and containers, especially in high-availability environments that may include live migration and/or redundant processing.
2: Mobile OSes (maybe compare Linux and Android?). More current research would be nice, but probably would not be able to fit it in anyway.
3: Device drivers. Something more substantial than the timer. Possibly UART?
4: How to actually write a multiprocessor kernel (we mostly just covered that it isn't an issue in seL4).
5: mobile os like android
6: N/A
7: Hard to say
8: Overview of other OS's (windows!), possibly in place of SMP hardware design.
15. Which of the current topics would you like to see scaled back or excluded?
1: Nothing in particular.
2: Benchmarking stuff would be better done as a series of exercises - not so useful to hear the Crimes read out
3: N/A the lectures were all useful, but some were just hard to follow for 3 hours straight, with minimal sleep.
4: Some of the microkernel stuff was repeated (policy vs mechanism, minimality), was fairly understandable the first time.
5: The caching lecture felt wasted because it was early on in the semester with slightly more difficult concepts, and wasn't a requirement in the assignment so the knowledge wasn't reinforced.
6: Real-Time Systems didn't seem very appropriate. I found that the standard OS lectures sufficient.
7: microkernels were repeated a couple of times
18. Any suggestions for improving lectures?
1: Beer. Also, possibly add a reading group every second/third week? One of my favourite parts of working at NICTA that most AOS students missed out on. Might not be feasible with time restrictions.
2: more consistent recordings. the microphone that gernot brought in later had too much background noise
3: seL4 is now pretty well established: we could probably cover more about building on top of it instead of how cool it is.
4: Friday Afternoon is a brutal timeslot for 3 hours of OS
5: 3-hour lectures... Even though there are breaks, having a 2-hour and a 1-hour session is easier
6: it is not very project specific (but that is expected). some of the content is interesting. lectures could be shorter? 3h is a very long time for one sitting. Maybe have it split into two presentations (2h main + 1h extra topic)? 3 hours on one topic can be a bit of a drag.
7: I'm not sure how the lectures could have been improved, but some did seem to drag on for a while.
8: The lecture content seems too general. I actually forgot most of them. And they also didn't help much in terms of project and exam.
22. Do you have comments/opinions on the demo submission and web-based demonstration process?
1: It worked well
2: Seemed like a good way to do it
3: what is web based demonstration? The online submission worked quite well in most cases (using cse give). It was simple enough to apply the patch and run.
4: Web based demonstration process????? In person demo was good though
5: Demo process was good, it was good to be able to explain your design to another person, from a marking perspective as well as solidifying your own understanding.
6: This all worked pretty well. Might be nice to remind students to rehearse demos before doing them live; we always wasted time when things didn't work nicely.
7: It worked well- it would have been cool to have sabres set up inside NICTA that we could access remotely
8: Being Tuesday was difficult when public holidays fall on the Monday. Particularly when combined with some of the milestones which had ordinary specifications; very difficult not to lose marks here.
23. Do you have comments on the compulsory use of git for revision control?
1: It's good and will lead to fewer partner-murders
2: Git is best VCS.
3: Had some problems with using git hosted on CSE. Maybe suggest in the first week for people to use bitbucket etc., as we had multiple problems with the CSE hosted git repo that were all solved by bitbucket.
4: It's reasonable and we'd have used git anyway.
5: git is good. Knowing it well saves a lot of headaches from merge conflicts. Never used the CSE one though - although I heard CSE has its own git server now, used for some courses like comp2041, which would be useful.
6: I don't think anyone used the provided repos but I could be wrong, i.e. everyone set up their own private repos on github/bitbucket etc. Github provides better functionality than bitbucket, something I wish I knew beforehand
7: git > any other revision control software
8: git is love git is life
9: git is awesome.
10: Prefer to use git, so didn't really impact me
24. We released the project as a git "repo", with the ability to pull patches (and push pull requests for your fixes) while the project was underway. Do you have comments on this?
1: It worked well
2: Worked out well
3: It was fine.
4: Only pulled a few things from upstream but it was easy to do so
5: Happy with this.
6: most effective way of doing it, given that we are already using git.
7: Using git was a good idea.
8: Likely the best available option to do this.
9: good, it came in handy for some of the fixes that needed to happen
26. If you are not doing a ToR/Summer Internship with us, or work with us under some other arrangement, please let us know why.
1: Already had work elsewhere
2: n/a
3: Already working.
4: I found a graduate position before the course started
5: I'm doing an internship at a different place over summer. I'm currently considering doing a Special Project with NICTA/Data61 next semester, which might be a good way to do work with you after AOS.
6: Internships at other companies.
7: I am currently working at NICTA, but I suggest you streamline your admission process so that people don't have to apply through ToR. Would be easier to be able to just send a CV straight through to NICTA. Also, possibly advertise NICTA work earlier in the year (semester 1), since most companies have finished hiring by then (eg. I was only available because I decided to turn down other offers earlier in the year).
8: I sneaked in via the verification team.
9: Applied but did not get an offer
10: ToR doesn't pay enough. Summer internship information came out after I'd accepted an external offer.
11: Accepted offer in industry.
27. Tutors report some unhappiness with the way milestones or bonuses were marked. Please let us know if there were any issues.
1: The marking criteria seemed deliberately cryptic, I suppose to encourage us to spot the 'design armpits' in advance. The course is brutal enough without this - be clear about how tutors will grade each milestone, publish their step-by-step. It would be nice to shift the 'subjective code mark' at the end to a couple of marks a week on top of a clearly communicated functionality mark. Getting wrecked at the end of the semester doesn't communicate much
2: The tutors often asked us to do things that weren't in the spec; these should either be put into the milestone specs, or students should be encouraged to go to consultations _because there are things that aren't in the spec_.
3: the main issue was the vague documentation of the spec. This meant we did not know what the tutors expected and what was (and was not) worth marks. Sometimes we would spend time making a good design for one component, which did not matter, but lost marks for others that we believed were ok. At times you got away with some tacky implementation. Along those lines, there were a lot of things that were marked in each milestone, and only one thing wrong resulted in you losing some marks. A marking guide would be good, or at the very least the things that the tutors are looking out for. Testing was also a bit iffy - since we wrote our own tests to demonstrate working milestones.
4: I didn't have an issue with it
5: The marking guidelines were highly arbitrary, bordering on contrived, at times. When there are stringent requirements about design, performance, and space efficiency, perhaps they could be indicated?
6: The hard milestone requirements sometimes get in the way and waste effort.
7: Some specs were not very clear about what should be demonstrated.
8: Spec was vague, so it wasn't certain what our design was meant to do. Marking of tutors was otherwise fair, even though they seemed to be as confused about the spec as we were (with good reason).
9: Apparently the marking process is 'the tutors try and read Kevin's mind, and then try and mark us as though they are Kevin'. This doesn't seem like a recipe for success.
10: There were some cases where the milestones were unclear (vm paging milestone) and the tutors couldn't give us relevant answers. It didn't significantly affect my enjoyment of the course however
28. Any other comments/suggestions that might help us to improve the course in the future?
1: more/longer consult hours, particularly on busy days like Friday. Somewhere to sleep in labs ;) Better tests. Overall a very good and enjoyable course.
2: I was unconvinced by the quality of the libraries (LWIP for example) and build system. This resulted in many, many hours spent debugging issues which were aside from and unrelated to our codebase. I'd estimate as much as 30% of the total project time was spent here. We also had marks deducted for issues subsequently found to be an artefact of the build system.
3: The course was fantastic. The first lecture, on seL4, was quite hard to follow (knowing nothing about seL4), which made the course have a rocky start. I had no idea what to do when I started Milestone 0, so maybe some more guidance would be a good idea. In hindsight it's the easiest milestone, so maybe make it worth 0 marks and give a step-by-step explanation of what needs to be done.
4: One large benefit of the tutors was being able to talk through design decisions with them. However, my partner and I didn't realise this until halfway through, so maybe this could be made more explicit.
5: Loosen the milestone order requirement. Would be nice if we didn't have to partially implement parts of future milestones in order to make one milestone, then have to scrap chunks of that already implemented code later.

Last modified: 20 Jul 2016.