Assessment cover
STUDENTS, PLEASE COPY THIS PAGE AND USE AS THE COVER PAGE FOR
YOUR SUBMISSION
Module No: COMP7024
Module title: Operating Systems Security and Development
Assessment title: Coursework
Due date and time:
Estimated total time to be spent on assignment: 35 hours per student
LEARNING OUTCOMES
On successful completion of this module, students will be able to achieve the following module
learning outcomes (LOs):
LO 1: Demonstrate a thorough understanding of the fundamentals of OS design, including process/thread, file, IO
and memory management.
LO 2: Create system-level software that modifies and extends existing operating systems. Conduct experiments
designed to evaluate the performance, security and reliability of their modifications and additions.
LO 3: Critically evaluate the security, reliability and protection in a given OS configuration. Use the results of the
evaluation to produce recommendations for hardening the system.
LO 4: Demonstrate a thorough understanding of multi-threaded/process systems through the design and
implementation of communicating, multi-threaded systems software.
Engineering Council AHEP4 LOs assessed (from S1 2022-23)
LOs copied and pasted from the AHEP4 matrix
STUDENT NAMES (ONLY IF GROUP ASSIGNMENT, OTHERWISE ANONYMOUS)
Student No: 1.
Student Name:
Group Name and Number:
Statement of Compliance (please tick to sign)
I declare that the work submitted is my own and that the work I submit is fully in
accordance with the University regulations regarding assessments
(www.brookes.ac.uk/uniregulations/current)

COMP7024 - Operating Systems Security and Development
Coursework - Part 2: OS Security Improvement
Semester 2, 2023-24
Part 2: Implementation, Testing, and Presenting the Results (50%)
Learning outcome
LO 1: Demonstrate a thorough understanding of the fundamentals of OS design, including
process/thread, file, IO and memory management.
LO 2: Create system-level software that modifies and extends existing operating systems. Conduct
experiments designed to evaluate the performance, security and reliability of their modifications and
additions.
LO 3: Critically evaluate the security, reliability and protection in a given OS configuration. Use the
results of the evaluation to produce recommendations for hardening the system.
LO 4: Demonstrate a thorough understanding of multi-threaded/process systems through the design
and implementation of communicating, multi-threaded systems software.
Task
Following coursework 1, in this part of the coursework you will:
1. Implement the identified method (12.5% of the overall module mark),
2. Test and evaluate the identified method, including a description of the method or experiment used to evaluate the result of the work (15% of the overall module mark),
3. Discuss the achieved result and show that the implementation addresses the identified gaps or areas for improvement (10% of the overall module mark),
4. Present a conclusion that summarises the work and includes limitations encountered in conducting the work and possible areas for future work (10% of the overall module mark), and
5. References (using Harvard or Numerical referencing style) and proper citation (2.5% of the overall module mark).
Deliverable, word limit, and deadline
This exercise is worth 50% of the total marks for the module.
Your report should be structured as listed in the Task section; the mark for each section is listed there.
Submission
Your report must be 1,500 words (excluding references, tables, figures, and the individual section). Reports exceeding the word limit by more than 20% will be penalised; the extra words will not be
marked.
Marks and feedback will be available on Moodle 3 weeks after submission.
This coursework is an individual piece of work. The University rules concerning plagiarism,
syndication and cheating apply. Using AI tools is allowed, but your report must be your own
writing. Copying and pasting from AI tools affects your mark and can also be considered
plagiarism, as it is someone else's work.
Version control: you must use a version control platform that records your report's
development history, e.g., Google Docs or GitHub. Your report must include the link to your
repository. If you use any other repository, you must justify this in your report's appendices.
Reports without a valid version control history will not be accepted.
Please read the marking rubric to better understand what you need to include in your report and how you should do it.
Marking rubric

Mark distribution (applies to every section):
0 - Not presented.
1 to <50 - Inadequately addressed: poor-quality content; incomplete/irrelevant.
100 - Perfect: complete, precise, clear, recent, well-structured, well-written, parsimonious, and covers everything listed under Excellent. No evidence of using AI in completing the work/report.

Implementation (12.5%): Implement the identified method.
50 to <60, Good: a good implementation, addressing most of the required features as explained in the first part of the coursework.
60 to <70, Very good: a very good implementation, addressing almost all the required features as explained in the first part of the coursework.
70 to <100, Excellent: an excellent implementation, complete and concise, following all the required features as explained in the first part of the coursework.

Validation (15%): Test and evaluate the identified method, including a description of the method or experiment used to evaluate the result of the work.
50 to <60, Good: a good validation section with a clear test plan, covering most of the implemented features, with a presentation of the results.
60 to <70, Very good: a very good validation section with a clear and concise test plan, covering almost all the implemented features, with a clear presentation of the results.
70 to <100, Excellent: an excellent validation section with a clear and concise test plan, covering all the implemented features, with a clear and complete presentation of the results.

Discussion (10%): Discuss the achieved result and show that the implementation addresses the identified gaps or areas for improvement.
50 to <60, Good: a good discussion section with some explanation of the improvements achieved from conducting the work and some analysis of the achieved results.
60 to <70, Very good: a very good discussion section with a clear explanation of the improvements achieved from conducting the work, including some comparison of the results with similar past work or with the system before the suggested changes were applied.
70 to <100, Excellent: an excellent discussion section with a clear and concise explanation of the improvements achieved from conducting the work, focusing on the results and comparing them with similar past work or with the system before the suggested changes were applied.

Conclusion (10%): Present a conclusion that summarises the work and includes limitations encountered in conducting the work and possible areas for future work.
50 to <60, Good: a good conclusion section that broadly summarises the conducted work, the achieved result, the importance of the work, the potential market and benefit, the limitations faced, and some potential improvements that could be made to complete the work as future work.
60 to <70, Very good: a very good conclusion section that clearly and almost fully summarises the conducted work, the achieved result, the importance of the work, the potential market and benefit, the limitations faced, and some potential improvements that could be made to complete the work as future work.
70 to <100, Excellent: an excellent conclusion section that clearly, concisely, and fully summarises the conducted work, the achieved result, the importance of the work, the potential market and benefit, the limitations faced, and some potential improvements that could be made to complete the work as future work.

References (2.5%): References (using Harvard or Numerical referencing style) and proper citation.
50 to <60, Acceptable: an acceptable list of related, valid references, some of which are cited in the report.
60 to <70, Very good: a very good list of related, valid references, some of which are cited in the report.
70 to <100, Excellent: an excellent, well-structured reference list containing valid scholarly resources, all of which are cited in the report.

Aims and Objectives
To develop an ML-based IoT authentication system that adapts to dynamic environmental conditions and user behaviours.
To evaluate the system's performance in detecting anomalies and adjusting authentication parameters effectively in real time.
To test the effectiveness of the method in different IoT contexts.
To carry out a comparative analysis between traditional techniques and the ML-based authentication system to identify advancements in dynamic adaptability (a sketch of such a comparison appears after the Justification section below).
Potential Solution
A method that can address key gaps in IoT authentication is a machine learning-driven adaptive authentication model (Al-Ghuwairi et al., 2023). It is a continuous learning process that analyses device interactions, user behaviours and environmental conditions to adjust authentication levels (Al-Naji & Zagrouba, 2020). Bharati and Podder (2022) suggest that machine learning algorithms such as behaviour analysis and anomaly detection play a crucial role in monitoring user behaviour, building accurate profiles and enhancing the system's ability to adapt to evolving circumstances.
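To make the model concrete, below is a minimal sketch of how a learned behavioural profile could drive authentication decisions. It assumes scikit-learn's IsolationForest as the anomaly detector; the session features, score cut-offs, and authentication levels are invented for illustration and are not prescribed by the cited works.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Each row is one session: [login_hour, typing_speed, packets_per_min, geo_distance_km]
    # (hypothetical features chosen only to illustrate behavioural profiling).
    normal_sessions = np.array([
        [9, 310, 42, 1.2],
        [10, 295, 40, 0.8],
        [9, 305, 45, 1.0],
        [11, 320, 38, 1.5],
        [10, 300, 41, 0.9],
    ])

    # Learn a behavioural profile from the user's/device's normal history.
    profile = IsolationForest(contamination=0.1, random_state=0).fit(normal_sessions)

    def required_auth_level(session):
        # Higher decision_function scores mean behaviour closer to the learned
        # profile; the cut-offs below are illustrative, not tuned values.
        score = profile.decision_function(np.asarray(session).reshape(1, -1))[0]
        if score > 0.05:
            return "password only"
        if score > -0.05:
            return "password + one-time code"   # mildly unusual: step up
        return "deny and re-enrol"              # strongly anomalous behaviour

    # A 3 a.m. session from 800 km away triggers stronger authentication.
    print(required_auth_level([3, 120, 400, 800.0]))

Retraining the profile as new legitimate sessions arrive is what would give such a model the continuous-learning, adaptive character described above.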
Justification
The method will be effective because it strengthens authentication mechanisms in the face of emerging and changing threats and user patterns. It is also significant for its ability to offer real-time insights into authentication contexts, enabling proactive adjustments (Ge et al., 2021). The approach will therefore provide a context-aware and resilient authentication solution.
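As a companion sketch for the comparative-analysis objective stated earlier, the fragment below (again assuming scikit-learn, with entirely synthetic data) contrasts a traditional static rule with the ML-based behavioural profile. Attacks staged inside office hours slip past a time-of-day check but remain detectable from behavioural features such as typing speed.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    # Normal sessions: office-hours logins with a stable typing speed.
    normal = np.column_stack([rng.uniform(8, 18, 200), rng.normal(300, 10, 200)])
    # Simulated attacks: staged inside office hours, so a time-of-day rule
    # misses them, but with the erratic typing speed of a credential-stuffing bot.
    attacks = np.column_stack([rng.uniform(9, 17, 20), rng.normal(150, 20, 20)])

    profile = IsolationForest(contamination=0.05, random_state=0).fit(normal)

    test = np.vstack([normal[:50], attacks])
    labels = np.array([0] * 50 + [1] * 20)  # 1 marks an attack session

    static_flags = (test[:, 0] < 8) | (test[:, 0] > 18)  # naive office-hours rule
    ml_flags = profile.predict(test) == -1               # -1 marks an outlier

    for name, flags in [("static rule", static_flags), ("ML profile", ml_flags)]:
        detected = int(np.sum(flags[labels == 1]))
        false_alarms = int(np.sum(flags[labels == 0]))
        print(f"{name}: {detected}/20 attacks detected, {false_alarms} false alarms")

The comparison in a real evaluation would of course use held-out sessions and genuine IoT traffic rather than synthetic draws; the sketch only illustrates the kind of head-to-head measurement the fourth objective calls for.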