Stephen Phillips

Chilworth, England, United Kingdom
500+ connections

About

- Coordinator or work-package leader in many IT research projects; PRINCE2…

Experience

  • IT Innovation Centre

    Southampton, United Kingdom

  • Earlier roles in Southampton, England, United Kingdom, and Shanklin, England, United Kingdom

Education

  • University of Southampton

    -

    Developing new algorithms, software and analysis methods to apply digital signal processing techniques to molecular dynamics simulations. Sponsored by SmithKline Beecham.

    My project, "The Computer Simulation of Conformational Change in Biological Molecules" used my computing, mathematical and chemistry skills, including algorithm design, programming and data analysis.

    Won second prize presenting my work as a lecture at the 1999 MGMS Young Modellers' Forum. Won first prize in the 1999 departmental research poster competition.

    In addition, I developed and managed the group's "Beowulf cluster", a cluster of commodity Linux PCs connected by high-speed networking, and was entrusted with the design and maintenance of the department's website.

  • -

    Winner of IMA prize for joint best exam performance in final year.

Volunteer Experience

  • Southampton District Coordinator

    Woodcraft Folk

    - 5 years 10 months

    Children

    Coordinating a group of volunteers who organise four groups for young people (aged 5 to 16) in Southampton. Woodcraft Folk is a national cooperative educational movement for young people, teaching them about themselves and the world whilst having fun.

Publications

  • FLAME D7.1: Data Management Action Plan

    FLAME is building a complex multi-stakeholder system with data, including personal data, being collected or generated in many components and needing to be shared to understand the system as a whole. This document explores the issues around managing these datasets including the legal and ethical operating framework (including an extensive analysis of the GDPR), the various different stakeholder types and the contractual arrangements between them, licensing of data, data repositories and the expected data sets. Finally, a data management template is provided for project trials to promote both good data management and the opening of datasets.

  • GRAVITATE: Geometric and Semantic Matching for Cultural Heritage Artefacts

    The Eurographics Association

    The GRAVITATE project is developing techniques that bring together geometric and semantic data analysis to provide a new and more effective method of re-associating, reassembling or reunifying cultural objects that have been broken or dispersed over time. The project is driven by the needs of archaeological institutes, and the techniques are exemplified by their application to a collection of several hundred 3D-scanned fragments of large-scale terracotta statues from Salamis, Cyprus. The integration of geometrical feature extraction and matching with semantic annotation and matching into a single decision support platform will lead to more accurate reconstructions of artefacts and greater insights into history. In this paper we describe the project and its objectives, then we describe the progress made to date towards achieving those objectives: describing the datasets, requirements and analysing the state of the art. We follow this with an overview of the architecture of the integrated decision support platform and the first realisation of the user dashboard. The paper concludes with a description of the continuing work being undertaken to deliver a workable system to cultural heritage curators and researchers.

  • 5G-ENSURE D2.2: Trust model (draft)

    Trust is a response to risk. A decision to trust someone (or something) is a decision to accept the risk that they will not perform as expected. To manage risk in a socio-technical system such as a mobile network we need to understand what trust decisions are being made, the consequences of those trust decisions and we need information on the trustworthiness of other parties in order to make better decisions.

    New business models and new domains of operation in 5G networks facilitated by network function virtualisation and software defined networking bring increased dynamicity compared to 4G and an increase in the number of stakeholders and associated trust relationships. New relationships bring new risks that must be understood and controlled and in a system as complex as 5G this implies the need for a trust model which can model the system, highlight potential risks and demonstrate the effect of adding controls or changing the design.

  • Linking quality of service and experience in distributed multimedia systems using PROV semantics

    IEEE 9th International Symposium on Service Oriented System Engineering (SOSE)

    Experimenters creating innovative applications that combine diverse distributed multimedia services with rich end user applications require enhanced insight into the relationships between the perceived quality of experience (QoE) and provided quality of service (QoS). We have implemented software which not only captures QoE and QoS measurements but, using a provenance ontology, also records the interactions between end users, the content, applications and the services. The data exploration interface provided allows an experimenter, working with participants in real-world situations, to understand the detail of the participants’ usage and experience of the system and the system performance factors contributing to their quality of experience.

  • EXPERIMEDIA: Final blueprint architecture for social and networked media testbeds

    This final blueprint architecture for social and networked media testbeds provides the foundation for the EXPERIMEDIA facility for baseline component development during the sustainability phase (Year 3) and for experiments conducted using the baseline (Year 3) and beyond. The document builds on the second blueprint architecture D2.1.6. The purpose of the architecture is described along with requirement considerations. A high-level description of the EXPERIMEDIA Platform architecture is provided, how services are delivered and how each component is integrated within experiments for both instrumentation/observation and also orchestration of information flows. The capabilities of each specific component are described including those supporting FMI content lifecycles and the Experiment Content Component supporting overall experiment management.

  • Snow White clouds and the Seven Dwarfs

    IEEE

    With increasing availability of Cloud computing services, this paper addresses the challenge consumers of Infrastructure-as-a-Service (IaaS) have in determining which IaaS provider and resources are best suited to run an application that may have specific Quality of Service (QoS) requirements. Utilising application modelling to predict performance is an attractive concept, but is very difficult with the limited information IaaS providers typically provide about the computing resources. This paper reports on an initial investigation into using Dwarf benchmarks to measure the performance of virtualised hardware, conducting experiments on BonFIRE and Amazon EC2. The results we obtain demonstrate that labels such as ‘small’, ‘medium’, ‘large’ or a number of ECUs are not sufficiently informative to predict application performance, as one might expect. Furthermore, knowing the CPU speed, cache size or RAM size is not necessarily sufficient either as other complex factors can lead to significant performance differences. We show that different hardware is better suited for different types of computations and, thus, the relative performance of applications varies across hardware. This is reflected well by Dwarf benchmarks and we show how different applications correlate more strongly with different Dwarfs, leading to the possibility of using Dwarf benchmark scores as parameters in application models.
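The correlation step at the end of the abstract can be illustrated in a few lines. The instance names, Dwarf benchmark scores and application speed-ups below are synthetic, invented purely to show the mechanics:

```python
import numpy as np

# Synthetic per-instance-type data (all numbers invented): scores for two
# Dwarf benchmarks and the relative speed of one application, each
# normalised to the smallest instance type.
instance_types = ["small", "medium", "large", "xlarge"]
dense_algebra_score = np.array([1.0, 1.8, 2.1, 3.9])  # higher = faster
spectral_score      = np.array([1.0, 1.2, 2.8, 3.0])
app_speed           = np.array([1.0, 1.7, 2.2, 4.0])

# Correlate the application's relative performance with each Dwarf:
# the strongest correlation suggests which Dwarf best models the app,
# and hence which benchmark score to use as a model parameter.
for name, score in [("dense algebra", dense_algebra_score),
                    ("spectral", spectral_score)]:
    r = np.corrcoef(score, app_speed)[0, 1]
    print(f"{name}: r = {r:.3f}")
```

On this made-up data the application tracks the dense-algebra Dwarf much more closely than the spectral one, which is the kind of signal the paper uses to pick model parameters.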

  • Platform-as-a-service architecture for real-time quality of service management in clouds

    Fifth International Conference on Internet and Web Applications and Services (ICIW), 2010

    Cloud computing offers the potential to dramatically reduce the cost of software services through the commoditization of information technology assets and on-demand usage patterns. However, the complexity of determining resource provision policies for applications in such complex environments introduces significant inefficiencies and has driven the emergence of a new class of infrastructure called Platform-as-a-Service (PaaS). In this paper, we present a novel PaaS architecture being developed in the EU IST IRMOS project targeting real-time Quality of Service (QoS) guarantees for online interactive multimedia applications. The architecture considers the full service lifecycle including service engineering, service level agreement design, provisioning and monitoring. QoS parameters at both application and infrastructure levels are given specific attention as the basis for provisioning policies in the context of temporal constraints. The generic applicability of the architecture is being verified and validated through implemented scenarios from three important application sectors (film post-production, virtual augmented reality for engineering design, collaborative e-Learning in virtual worlds).

  • Application of the Hilbert-Huang transform to the analysis of molecular dynamics simulations

    Journal of Physical Chemistry A

    The Hilbert-Huang transform (HHT) is a new method for the analysis of nonstationary signals that allows a signal's frequency and amplitude to be evaluated with excellent time resolution. In this paper, the HHT method is described, and its performance is compared with the Fourier methods of spectral analysis. The HHT is then applied to the analysis of molecular dynamics simulation trajectories, including enhanced sampling trajectories produced by reversible digitally filtered molecular dynamics. Amplitude-time, amplitude-frequency, and amplitude-frequency-time spectra are all produced with the method and compared to equivalent results obtained using wavelet analysis. The wavelet and HHT analysis yield qualitatively similar results, but the HHT provides a better match to physical intuition than the wavelet transform. Moreover, the HHT method is able to show the flow of energy into low-frequency vibrations during conformational change events and is able to identify frequencies appropriate for amplification by digital filters including the observation of a 10 cm⁻¹ shift in target frequency.
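The Hilbert step of the HHT (applied after empirical mode decomposition, which is omitted here) can be sketched on a synthetic chirp standing in for one intrinsic mode function of a trajectory; all signal parameters below are invented for illustration:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic nonstationary signal: a chirp whose instantaneous frequency
# rises linearly as 10 + 40*t (arbitrary units), sampled at 1000 samples
# per time unit.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.cos(2 * np.pi * (10 * t + 20 * t**2))

# The analytic signal from the Hilbert transform gives instantaneous
# amplitude and frequency with good time resolution - the property the
# HHT exploits for nonstationary trajectories.
analytic = hilbert(signal)
amplitude = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs  # one sample shorter

# At t = 0.5 the instantaneous frequency should be close to 10 + 40*0.5.
print(inst_freq[len(inst_freq) // 2])
```

A Fourier spectrum of the same signal would only show a smear between 10 and 50 frequency units, with no indication of when each frequency occurred.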

  • Digitally filtered molecular dynamics: the frequency specific control of molecular dynamics simulations

    Journal of Chemical Physics

    A new method for modifying the course of a molecular dynamics computer simulation is presented. Digitally filtered molecular dynamics (DFMD) applies the well-established theory of digital filters to molecular dynamics simulations, enabling atomic motion to be enhanced or suppressed in a selective manner solely on the basis of frequency. The basic theory of digital filters and its application to molecular dynamics simulations is presented, together with the application of DFMD to the simple systems of single molecules of water and butane. The extension of the basic theory to the condensed phase is then described followed by its application to liquid phase butane and the Syrian hamster prion protein. The high degree of selectivity and control offered by DFMD, and its ability to enhance the rate of conformational change in butane and in the prion protein, is demonstrated.
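This is not the authors' code, but the filtering idea can be sketched on a synthetic one-dimensional "velocity" trajectory: design a digital filter and use it to suppress one vibrational frequency while leaving another untouched. All frequencies and filter parameters here are invented for illustration.

```python
import numpy as np
from scipy.signal import firwin, filtfilt

# Synthetic "velocity" trajectory: two superposed vibrations at 2 and 20
# frequency units, sampled at 100 samples per time unit for 10 time units.
fs = 100.0
t = np.arange(0, 10.0, 1.0 / fs)
velocity = np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.sin(2 * np.pi * 20.0 * t)

# Low-pass FIR filter with cutoff 5: keep the 2-unit mode, suppress the
# 20-unit mode, purely on the basis of frequency.
taps = firwin(numtaps=101, cutoff=5.0, fs=fs)
filtered = filtfilt(taps, 1.0, velocity)  # zero-phase filtering

def band_amplitude(x, f):
    """Amplitude of the spectral component nearest frequency f."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    return np.abs(spectrum[np.argmin(np.abs(freqs - f))]) * 2 / len(x)

# The 20-unit component should be strongly attenuated, the 2-unit one kept.
print(band_amplitude(velocity, 20.0), band_amplitude(filtered, 20.0))
```

In DFMD proper the filter acts on atomic velocities inside the integrator rather than on a stored trajectory, but the frequency selectivity shown here is the same mechanism.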


Honors & Awards

  • CWops 2020 Award For Advancing The Art Of CW

    CWops

    The CWops award was given in recognition of the contribution to learning CW (Morse code by radio) made by the tools on my MorseCode.World website.

Languages

  • English

    -

Recommendations received

4 people have recommended Stephen
