903 results for Bayesian inference, Behaviour analysis, Security, Visual surveillance


Relevance: 40.00%

Abstract:

With the recent explosion in the complexity and amount of digital multimedia data, there has been a huge impact on the operations of various organizations in areas such as government services, education, medical care, business, and entertainment. To satisfy the growing demand for multimedia data management systems, an integrated framework called DIMUSE is proposed and deployed for distributed multimedia applications, offering a full scope of multimedia-related tools and an appealing experience for users. This research focuses on video database modeling and retrieval by addressing a set of core challenges. First, a comprehensive multimedia database modeling mechanism called the Hierarchical Markov Model Mediator (HMMM) is proposed to model high-dimensional media data, including video objects, low-level visual/audio features, and historical access patterns and frequencies. The associated retrieval and ranking algorithms are designed to support not only general queries but also complicated temporal event pattern queries. Second, system training and learning methodologies are incorporated so that user interests are mined efficiently to improve retrieval performance. Third, video clustering techniques are proposed to continuously increase search speed and accuracy by architecting a more efficient multimedia database structure. A distributed video management and retrieval system is designed and implemented to demonstrate the overall performance. The proposed approach is further customized for a mobile-based video retrieval system to address the perception subjectivity issue by considering individual users' profiles. Moreover, to deal with security and privacy concerns in distributed multimedia applications, DIMUSE also incorporates a practical framework called SMARXO, which supports multilevel multimedia security control. SMARXO efficiently combines role-based access control (RBAC), XML, and an object-relational database management system (ORDBMS) to achieve proficient security control. A distributed multimedia management system named DMMManager (Distributed MultiMedia Manager) is developed with the proposed framework DIMUSE to support multimedia capturing, analysis, retrieval, authoring, and presentation in one single framework.
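
The abstract does not spell out the HMMM computations, but the core mediator idea it builds on, ranking candidate videos by combining feature similarity with an affinity matrix learned from historical access patterns, can be sketched. Everything below (matrix shapes, weights, the blending step) is an illustrative assumption, not the published model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_videos, n_features = 5, 4

# A: affinity between videos learned from historical access patterns,
# row-normalized into transition probabilities (the Markov part).
A = rng.random((n_videos, n_videos))
A /= A.sum(axis=1, keepdims=True)

# B: low-level visual/audio feature vector for each video.
B = rng.random((n_videos, n_features))

# Feature vector extracted from the user's query.
query = rng.random(n_features)

# Content-based similarity of each video to the query (cosine).
sim = B @ query / (np.linalg.norm(B, axis=1) * np.linalg.norm(query))

# Blend content similarity with one propagation step through the affinity
# matrix, so videos frequently co-accessed with good matches are promoted.
scores = 0.5 * sim + 0.5 * (A.T @ sim)

print("ranked video ids:", np.argsort(scores)[::-1])
```

A real HMMM adds a hierarchy of such models (clusters of videos, objects, and features) plus temporal pattern matching on top of this single-level ranking.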

Relevance: 40.00%

Abstract:

How do local homeland security organizations respond to catastrophic events such as hurricanes and acts of terrorism? Among the most important aspects of this response is these organizations' ability to adapt to the uncertain nature of these "focusing events" (Birkland 1997). They are often behind the curve, seeing response as a linear process, when in fact it is a complex, multifaceted process that requires understanding the interactions between the fiscal pressures facing local governments, the institutional pressures of working within a new regulatory framework, and the political pressures of bringing together different levels of government with different perspectives and agendas. This dissertation traces the factors affecting the individuals and institutions planning for, preparing for, responding to, and recovering from natural and man-made disasters. Using social network analysis, my study analyzes the interactions between the individuals and institutions that respond to these "focusing events." In practice, it is the combination of budgetary, institutional, and political pressures or constraints interacting with each other that resembles a complex adaptive system (CAS). To investigate this system, my study evaluates the evolution of two separate sets of organizations composed of first responders (fire chiefs, emergency management coordinators) and community volunteers organized in the state of Florida over the last fifteen years. Using a social network analysis approach, my dissertation analyzes the interactions between Citizen Corps Councils (CCCs) and Community Emergency Response Teams (CERTs) in the state of Florida from 1996 to 2011. The pattern of interconnections that occur over time is the focus of this study. The social network analysis revealed an increase in the number and density of connections between these organizations over the last fifteen years. The analysis also exposed the underlying patterns in these connections: as the networks became more complex, they also became more decentralized, though not in any uniform manner. The present study brings to light a story of how communities have adapted to the ever-changing circumstances that are the sine qua non of natural and man-made disasters.
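
The abstract does not list the network metrics used; the basic measurement, tracking the density of an organizational network over successive snapshots, can be sketched as follows. The edge lists are invented placeholders, not study data.

```python
import networkx as nx

# Hypothetical yearly edge lists between councils (CCC_*) and teams (CERT_*);
# the study built comparable snapshots from fifteen years of Florida data.
snapshots = {
    2000: [("CCC_Miami", "CERT_Dade"), ("CCC_Tampa", "CERT_Hillsborough")],
    2005: [("CCC_Miami", "CERT_Dade"), ("CCC_Tampa", "CERT_Hillsborough"),
           ("CCC_Miami", "CERT_Broward"), ("CCC_Orlando", "CERT_Orange")],
    2010: [("CCC_Miami", "CERT_Dade"), ("CCC_Tampa", "CERT_Hillsborough"),
           ("CCC_Miami", "CERT_Broward"), ("CCC_Orlando", "CERT_Orange"),
           ("CCC_Tampa", "CERT_Pinellas"), ("CCC_Orlando", "CERT_Seminole")],
}

for year, edges in snapshots.items():
    g = nx.Graph(edges)
    # Density = realized edges / possible edges; rising density with falling
    # degree centralization would match the decentralization finding.
    print(year, "nodes:", g.number_of_nodes(),
          "density:", round(nx.density(g), 3))
```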

Relevance: 40.00%

Abstract:

In this study we have identified key genes that are critical in the development of astrocytic tumors. A meta-analysis of microarray studies comparing normal tissue to astrocytoma revealed a set of 646 genes differentially expressed in the majority of astrocytomas. Reverse engineering of these 646 genes using Bayesian network analysis produced a gene network for each grade of astrocytoma (Grades I–IV), and 'key genes' within each grade were identified. The genes found to be most influential in the development of the highest grade of astrocytoma, glioblastoma multiforme, were COL4A1, EGFR, BTF3, MPP2, RAB31, CDK4, CD99, ANXA2, TOP2A, and SERBP1. All of these genes were up-regulated except MPP2 (down-regulated). These 10 genes were able to predict tumor status with 96–100% confidence when using logistic regression, cross-validation, and support vector machine analysis. The Markov blanket genes interact with NF-κB, ERK, MAPK, VEGF, growth hormone, and collagen to produce a network whose top biological functions are cancer, neurological disease, and cellular movement. Three of the 10 genes (EGFR, COL4A1, and CDK4) in particular appeared to be potential 'hubs of activity'. Modified expression of these 10 Markov blanket genes increases the lifetime risk of developing glioblastoma compared to the normal population. The glioblastoma risk estimates increased dramatically with the joint effects of four or more Markov blanket genes: joint interaction effects of 4, 5, 6, 7, 8, 9, or 10 Markov blanket genes produced increases of 9%, 13%, 20.9%, 26.7%, 52.8%, 53.2%, 78.1%, or 85.9%, respectively, in the lifetime risk of developing glioblastoma compared to the normal population. In summary, it appears that modified expression of several 'key genes' may be required for the development of glioblastoma. Further studies are needed to validate these 'key genes' as useful tools for early detection and novel therapeutic options for these tumors.
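
As a rough illustration of the reported validation step, cross-validated logistic regression and SVM prediction of tumor status, here is a minimal sketch on synthetic stand-in expression data (the real study used the measured expression of the 10 genes):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic stand-in for expression of 10 genes in 100 normal and
# 100 tumor samples; tumor means are shifted to mimic up-regulation,
# with one gene shifted down to mimic MPP2's down-regulation.
n, n_genes = 100, 10
normal = rng.normal(0.0, 1.0, (n, n_genes))
tumor = rng.normal(1.5, 1.0, (n, n_genes))
tumor[:, 3] -= 3.0

X = np.vstack([normal, tumor])
y = np.array([0] * n + [1] * n)

for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("support vector machine", SVC(kernel="rbf"))]:
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
    print(name, "mean accuracy:", round(scores.mean(), 3))
```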

Relevance: 40.00%

Abstract:

The organizational authority of the Papacy in the Roman Catholic Church and the permanent membership of the UN Security Council are distinct from institutions commonly compared with the UN, such as the Concert of Europe and the League of Nations, in that these institutional organs possessed strong authoritative and veto powers. Both organs also owe the strong authority they held at their founding to a need for stability: the Papacy after the crippling of the Western Roman Empire, and the P-5 after the insecurities of the post-WWII world. While the P-5 still possesses authoritative powers within the Council similar to those it held after WWII, the historical authoritative powers of the Papacy within the Church were debilitated to such a degree that by the time of the Reformation in Europe, condemnations of practices within the Church itself were no longer effective. This paper analyzes major challenges to the authoritative powers of the Papacy, from the crowning of Charlemagne to the beginning of the Reformation, and compares that analysis to challenges affecting the authoritative powers of the P-5 since its creation. From the research conducted thus far, I hypothesize that common themes affecting the authoritative powers of the P-5 and the Papacy include: major changes in the institution's organization (e.g., the Avignon Papacy and Japan's bid to become a permanent member); the decline in power of actors supporting the institutional organ (e.g., the Holy Roman Empire and the P-5 members); and ideological clashes affecting the institution's normative power (e.g., the Great Western Schism and Cold War politics).


Relevance: 40.00%

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically through dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident of 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques, as it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
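
The PEWMA model itself is not described in the abstract, but the conventional conjugate Poisson-Gamma baseline it is compared against, plus Monte Carlo evaluation of a toy top event, can be sketched. The priors, observed counts, and two-event OR gate below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Conjugate Poisson-Gamma updating: with a Gamma(alpha, beta) prior on a
# failure rate lambda (beta is a rate parameter) and k failures observed
# over exposure t, the posterior is Gamma(alpha + k, beta + t).
def update(alpha, beta, k, t):
    return alpha + k, beta + t

priors = {"pump_fails": (2.0, 10.0), "valve_fails": (1.0, 20.0)}   # invented
observed = {"pump_fails": (3, 5.0), "valve_fails": (0, 5.0)}       # (k, t yrs)
posteriors = {e: update(a, b, *observed[e]) for e, (a, b) in priors.items()}

# One dynamic fault tree step: sample posterior rates, convert each to a
# one-year failure probability, and evaluate a toy OR-gate top event.
n = 100_000
p = {e: 1.0 - np.exp(-rng.gamma(a, 1.0 / b, n))   # P(>=1 failure in 1 year)
     for e, (a, b) in posteriors.items()}
top = 1.0 - (1.0 - p["pump_fails"]) * (1.0 - p["valve_fails"])

print("mean top event probability:", round(top.mean(), 4))
```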

Relevance: 40.00%

Abstract:

We are grateful for the co-operation and assistance that we received from NHS staff in the co-ordinating centres and clinical sites. We thank the women who participated in TOMBOLA. The TOMBOLA trial was supported by the Medical Research Council (G9700808) and the NHS in England and Scotland.

The TOMBOLA Group comprises the following:

Grant-holders:
University of Aberdeen and NHS Grampian, Aberdeen, Scotland: Maggie Cruickshank, Graeme Murray, David Parkin, Louise Smart, Eric Walker, Norman Waugh (Principal Investigator 2004–2008)
University of Nottingham and Nottingham NHS, Nottingham, England: Mark Avis, Claire Chilvers, Katherine Fielding, Rob Hammond, David Jenkins, Jane Johnson, Keith Neal, Ian Russell, Rashmi Seth, Dave Whynes
University of Dundee and NHS Tayside, Dundee, Tayside: Ian Duncan, Alistair Robertson (deceased)
University of Ottawa, Ottawa, Canada: Julian Little (Principal Investigator 1999–2004)
National Cancer Registry, Cork, Ireland: Linda Sharp
Bangor University, Bangor, Wales: Ian Russell
University of Hull, Hull, England: Leslie G Walker

Staff in clinical sites and co-ordinating centres:
Grampian: Breda Anthony, Sarah Bell, Adrienne Bowie, Katrina Brown (deceased), Joe Brown, Kheng Chew, Claire Cochran, Seonaidh Cotton, Jeannie Dean, Kate Dunn, Jane Edwards, David Evans, Julie Fenty, Al Finlayson, Marie Gallagher, Nicola Gray, Maureen Heddle, Alison Innes, Debbie Jobson, Mandy Keillor, Jayne MacGregor, Sheona Mackenzie, Amanda Mackie, Gladys McPherson, Ike Okorocha, Morag Reilly, Joan Rodgers, Alison Thornton, Rachel Yeats
Tayside: Lindyanne Alexander, Lindsey Buchanan, Susan Henderson, Tine Iterbeke, Susanneke Lucas, Gillian Manderson, Sheila Nicol, Gael Reid, Carol Robinson, Trish Sandilands
Nottingham: Marg Adrian, Ahmed Al-Sahab, Elaine Bentley, Hazel Brook, Claire Bushby, Rita Cannon, Brenda Cooper, Ruth Dowell, Mark Dunderdale, Dr Gabrawi, Li Guo, Lisa Heideman, Steve Jones, Salli Lawson, Zoë Philips, Christopher Platt, Shakuntala Prabhakaran, John Rippin, Rose Thompson, Elizabeth Williams, Claire Woolley

Statistical analysis: Seonaidh Cotton, Kirsten Harrild, John Norrie, Linda Sharp

External Trial Steering Committee: Nicholas Day (chair, 1999–2004), Theresa Marteau (chair, 2004–), Mahesh Parmar, Julietta Patnick and Ciaran Woodman.

Relevance: 40.00%

Abstract:

Primary hyperparathyroidism (PHPT) is a common endocrine neoplastic disorder caused by a failure of calcium sensing secondary to tumour development in one or more of the parathyroid glands. Parathyroid adenomas comprise distinct cellular subpopulations of variable clonal status that exhibit differing degrees of calcium responsiveness. To gain a clearer understanding of the relationship among cellular identity, tumour composition and clinical biochemistry in PHPT, we developed a novel single-cell platform for quantitative evaluation of calcium sensing behaviour in freshly resected human parathyroid tumour cells. Live-cell intracellular calcium flux was visualized through Fluo-4-AM epifluorescence, followed by in situ immunofluorescence detection of the calcium sensing receptor (CASR), a central component of the extracellular calcium signalling pathway. The reactivity of individual parathyroid tumour cells to extracellular calcium stimulus was highly variable, with discrete kinetic response patterns observed both between and among parathyroid tumour samples. CASR abundance was not an obligate determinant of calcium responsiveness. Calcium EC50 values from a series of parathyroid adenomas revealed that the tumours segregated into two distinct categories. One group manifested a mean EC50 of 2.40 mM (95% CI: 2.37-2.41), closely aligned with the established normal range. The second group was less responsive to calcium stimulus, with a mean EC50 of 3.61 mM (95% CI: 3.45-3.95). This binary distribution indicates the existence of a previously unappreciated biochemical sub-classification of PHPT tumours, possibly reflecting distinct etiological mechanisms. Recognition of quantitative differences in calcium sensing could have important implications for the clinical management of PHPT.
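
The abstract reports EC50 values without the fitting procedure; a common way to estimate an EC50 from dose-response data of this kind is a four-parameter Hill fit, sketched here on invented responses.

```python
import numpy as np
from scipy.optimize import curve_fit

# Four-parameter Hill (sigmoid) model for dose-response data.
def hill(conc, bottom, top, ec50, slope):
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** slope)

# Invented extracellular calcium concentrations (mM) and normalized
# Fluo-4 responses for one tumour sample.
conc = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
resp = np.array([0.02, 0.05, 0.15, 0.35, 0.62, 0.83, 0.94, 0.98])

params, _ = curve_fit(hill, conc, resp, p0=[0.0, 1.0, 2.4, 4.0])
print("estimated EC50 (mM):", round(params[2], 2))
```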

Relevance: 40.00%

Abstract:

Surveys can collect important data that inform policy decisions and drive social science research. Large government surveys collect information from the U.S. population on a wide range of topics, including demographics, education, employment, and lifestyle. Analysis of survey data presents unique challenges. In particular, one needs to account for missing data, complex sampling designs, and measurement error. Conceptually, a survey organization could spend substantial resources getting high-quality responses from a simple random sample, resulting in survey data that are easy to analyze. However, this scenario often is not realistic. To address these practical issues, survey organizations can leverage information available from other sources of data. For example, in longitudinal studies that suffer from attrition, they can use information from refreshment samples to correct for potential attrition bias. They can use information from known marginal distributions or the survey design to improve inferences. They can use information from gold standard sources to correct for measurement error.

This thesis presents novel approaches to combining information from multiple sources that address the three problems described above.

The first method addresses nonignorable unit nonresponse and attrition in a panel survey with a refreshment sample. Panel surveys typically suffer from attrition, which can lead to biased inference when basing analysis only on cases that complete all waves of the panel. Unfortunately, the panel data alone cannot inform the extent of the bias due to attrition, so analysts must make strong and untestable assumptions about the missing data mechanism. Many panel studies also include refreshment samples, which are data collected from a random sample of new individuals during some later wave of the panel. Refreshment samples offer information that can be utilized to correct for biases induced by nonignorable attrition while reducing reliance on strong assumptions about the attrition process. To date, these bias correction methods have not dealt with two key practical issues in panel studies: unit nonresponse in the initial wave of the panel and in the refreshment sample itself. As we illustrate, nonignorable unit nonresponse can significantly compromise the analyst's ability to use the refreshment samples for attrition bias correction. Thus, it is crucial for analysts to assess how sensitive their inferences, corrected for panel attrition, are to different assumptions about the nature of the unit nonresponse. We present an approach that facilitates such sensitivity analyses, both for suspected nonignorable unit nonresponse in the initial wave and in the refreshment sample. We illustrate the approach using simulation studies and an analysis of data from the 2007-2008 Associated Press/Yahoo News election panel study.

The second method incorporates informative prior beliefs about marginal probabilities into Bayesian latent class models for categorical data. The basic idea is to append synthetic observations to the original data such that (i) the empirical distributions of the desired margins match those of the prior beliefs, and (ii) the values of the remaining variables are left missing. The degree of prior uncertainty is controlled by the number of augmented records. Posterior inferences can be obtained via typical MCMC algorithms for latent class models, tailored to deal efficiently with the missing values in the concatenated data. We illustrate the approach using a variety of simulations based on data from the American Community Survey, including an example of how augmented records can be used to fit latent class models to data from stratified samples.
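
A minimal sketch of the augmented-records construction described above (variable names, the margin, and record counts are invented; the MCMC fitting of the latent class model is not shown):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# Original categorical data (invented variables and sizes).
data = pd.DataFrame({
    "education": rng.choice(["HS", "BA", "grad"], 500),
    "employment": rng.choice(["employed", "not"], 500),
})

# Prior belief about the education margin; n_aug controls prior strength
# (more augmented records = less prior uncertainty).
prior_margin = {"HS": 0.55, "BA": 0.30, "grad": 0.15}
n_aug = 200
counts = {k: int(round(v * n_aug)) for k, v in prior_margin.items()}

# Synthetic records: the margin matches the prior exactly; every other
# variable is left missing, to be handled by the latent class MCMC.
aug = pd.DataFrame({
    "education": np.repeat(list(counts), list(counts.values())),
    "employment": np.nan,
})

combined = pd.concat([data, aug], ignore_index=True)
print(combined["education"].value_counts(normalize=True).round(2))
```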

The third method leverages the information from a gold standard survey to model reporting error. Survey data are subject to reporting error when respondents misunderstand the question or accidentally select the wrong response. Sometimes survey respondents knowingly select the wrong response, for example, by reporting a higher level of education than they actually have attained. We present an approach that allows an analyst to model reporting error by incorporating information from a gold standard survey. The analyst can specify various reporting error models and assess how sensitive their conclusions are to different assumptions about the reporting error process. We illustrate the approach using simulations based on data from the 1993 National Survey of College Graduates. We use the method to impute error-corrected educational attainments in the 2010 American Community Survey using the 2010 National Survey of College Graduates as the gold standard survey.
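
The reporting-error models are not specified in the abstract; a toy version of the general idea, a misclassification matrix combined with Bayes' rule to draw error-corrected imputations, could look like the following. All probabilities are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
levels = ["HS", "BA", "grad"]

# Invented misclassification matrix: rows = true education, columns =
# reported education; off-diagonal mass allows over-reporting.
report_given_true = np.array([
    [0.90, 0.08, 0.02],
    [0.00, 0.88, 0.12],
    [0.00, 0.02, 0.98],
])

# Marginal distribution of true education, e.g. from the gold standard.
p_true = np.array([0.50, 0.35, 0.15])

def impute_true(reported, n_draws=5):
    """Draw error-corrected values from P(true | reported) via Bayes' rule."""
    j = levels.index(reported)
    post = p_true * report_given_true[:, j]
    post /= post.sum()
    return rng.choice(levels, size=n_draws, p=post)

print("reported 'grad' ->", impute_true("grad"))
```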

Relevance: 40.00%

Abstract:

Bayesian nonparametric models, such as the Gaussian process and the Dirichlet process, have been extensively applied to target kinematics modeling in various applications, including environmental monitoring, traffic planning, endangered species tracking, dynamic scene analysis, autonomous robot navigation, and human motion modeling. As shown by these successful applications, Bayesian nonparametric models are able to adjust their complexity adaptively from data as necessary, and are resistant to overfitting or underfitting. However, most existing works assume that the sensor measurements used to learn the Bayesian nonparametric target kinematics models are obtained a priori, or that the target kinematics can be measured by the sensor at any given time throughout the task. Little work has been done on controlling a sensor with a bounded field of view to obtain the measurements of mobile targets that are most informative for reducing the uncertainty of the Bayesian nonparametric models. To present the systematic sensor planning approach to learning Bayesian nonparametric models, the Gaussian process target kinematics model is introduced first; it is capable of describing time-invariant spatial phenomena, such as ocean currents, temperature distributions, and wind velocity fields. The Dirichlet process-Gaussian process target kinematics model is subsequently discussed for modeling mixtures of mobile targets, such as pedestrian motion patterns.
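
As a toy stand-in for the Gaussian process kinematics model (not the dissertation's implementation), a standard GP can regress a time-invariant velocity field on position; the data below are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)

# Synthetic time-invariant field: one velocity component as a function
# of 2-D position, with measurement noise.
X = rng.uniform(0, 10, (50, 2))
u = np.sin(X[:, 0] / 3.0) + 0.05 * rng.normal(size=50)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0) + WhiteKernel(0.01),
                              normalize_y=True)
gp.fit(X, u)

# The predictive standard deviation is the uncertainty an informative
# sensor plan would aim to reduce.
mean, std = gp.predict(np.array([[5.0, 5.0], [9.0, 1.0]]), return_std=True)
print("predicted velocity:", mean.round(2), "+/-", std.round(2))
```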

Novel information-theoretic functions are developed for these Bayesian nonparametric target kinematics models to represent the expected utility of measurements as a function of sensor control inputs and random environmental variables. A Gaussian process expected Kullback-Leibler (KL) divergence is developed as the expectation of the KL divergence between the current (prior) and posterior Gaussian process target kinematics models with respect to the future measurements. This approach is then extended to develop a new information value function that can be used to estimate target kinematics described by a Dirichlet process-Gaussian process mixture model. A theorem is proposed showing that the novel information-theoretic functions are bounded. Based on this theorem, efficient estimators of the new information-theoretic functions are designed, which are proved to be unbiased, with the variance of the resulting approximation error decreasing linearly as the number of samples increases. The computational complexity of optimizing the novel information-theoretic functions under sensor dynamics constraints is studied and proved to be NP-hard. A cumulative lower bound is then proposed to reduce the computational complexity to polynomial time.
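
The estimators themselves are not given in the abstract, but their building block, the KL divergence between two Gaussian distributions (e.g., prior and posterior GP marginals at a set of test points), has the closed form sketched below; the means and covariances are invented examples.

```python
import numpy as np

# Closed-form KL divergence KL(N0 || N1) between two multivariate normals.
def gaussian_kl(mu0, cov0, mu1, cov1):
    k = len(mu0)
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

# Invented prior and posterior marginals at two test points: after a
# measurement the mean shifts and the covariance shrinks.
mu0 = np.array([0.0, 0.0]); cov0 = np.array([[1.0, 0.3], [0.3, 1.0]])
mu1 = np.array([0.4, 0.1]); cov1 = np.array([[0.5, 0.1], [0.1, 0.6]])

print("KL(prior || posterior):", round(gaussian_kl(mu0, cov0, mu1, cov1), 3))
```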

Three sensor planning algorithms are developed according to the assumptions on the target kinematics and the sensor dynamics. For problems where the control space of the sensor is discrete, a greedy algorithm is proposed. The efficiency of the greedy algorithm is demonstrated by a numerical experiment with ocean current data obtained from moored buoys. A sweep line algorithm is developed for applications where the sensor control space is continuous and unconstrained. Synthetic simulations as well as physical experiments with ground robots and a surveillance camera are conducted to evaluate the performance of the sweep line algorithm. Moreover, a lexicographic algorithm is designed based on the cumulative lower bound of the novel information-theoretic functions for the scenario where the sensor dynamics are constrained. Numerical experiments with real data collected from indoor pedestrians by a commercial pan-tilt camera are performed to examine the lexicographic algorithm. Results from both the numerical simulations and the physical experiments show that the three sensor planning algorithms proposed in this dissertation, based on the novel information-theoretic functions, are superior at learning the target kinematics with little or no prior knowledge.
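
For the discrete-control case, the greedy strategy reduces to repeatedly picking the control with the highest estimated information value. The sketch below uses a made-up information function with diminishing returns purely to show the control flow.

```python
# Made-up information value of each pan/tilt control, with diminishing
# returns for repeats; in the dissertation this would be an estimate of
# the expected-KL information function.
def info_value(control, history):
    base = {"pan_left": 0.8, "pan_right": 0.6, "tilt_up": 0.4}[control]
    return base / (1 + history.count(control))

controls = ["pan_left", "pan_right", "tilt_up"]
history = []

# Greedy planning: at each step take the myopically best control.
for _ in range(5):
    history.append(max(controls, key=lambda c: info_value(c, history)))

print("greedy control sequence:", history)
```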

Relevance: 40.00%

Abstract:

Changes in olfactory-mediated behaviour caused by elevated CO2 levels in the ocean could affect recruitment to reef fish populations because larval fish become more vulnerable to predation. However, it is currently unclear how elevated CO2 will impact the other key part of the predator-prey interaction: the predators. We investigated the effects of elevated CO2 and reduced pH on the olfactory preferences, activity levels and feeding behaviour of a common coral reef meso-predator, the brown dottyback (Pseudochromis fuscus). Predators were exposed to either current-day CO2 levels or one of two elevated CO2 levels (~600 µatm or ~950 µatm) that may occur by 2100 according to climate change predictions. Exposure to elevated CO2 and reduced pH caused a shift from preference to avoidance of the smell of injured prey, with CO2-treated predators spending approximately 20% less time in a water stream containing prey odour compared with controls. Furthermore, activity levels were higher for fish in the high-CO2 treatment and feeding activity was lower for fish in the mid-CO2 treatment, indicating that future conditions may reduce the ability of the fish to respond rapidly to fluctuations in food availability. Elevated activity levels of predators in the high-CO2 treatment, however, may compensate for reduced olfactory ability, as greater movement facilitated visual detection of food. Our findings show that, at least for the species tested to date, both parties in the predator-prey relationship may be affected by ocean acidification. Although impairment of the olfactory-mediated behaviour of predators might reduce the risk of predation for larval fishes, the magnitude of the observed effects of elevated CO2 appears to be more dramatic for prey than for predators. Thus, it is unlikely that the altered behaviour of predators is sufficient to fully compensate for the effects of ocean acidification on prey mortality.

Relevance: 40.00%

Abstract:

The police use both subjective (i.e. police staff) and automated (e.g. face recognition systems) methods for the completion of visual tasks (e.g. person identification). Image quality for police tasks has been defined as image usefulness, or the suitability of the visual material to satisfy a visual task. It is not necessarily affected by artefacts that reduce visual image quality (i.e. decrease fidelity), as long as these artefacts do not affect the useful information relevant to the task. The capture of useful information is affected by the unconstrained conditions commonly encountered by CCTV systems, such as variations in illumination and high compression levels. The main aim of this thesis is to investigate aspects of image quality and video compression that may affect the completion of police visual tasks/applications with respect to CCTV imagery. This is accomplished by investigating three specific police areas/tasks utilising: 1) the human visual system (HVS) for a face recognition task, 2) automated face recognition systems, and 3) automated human detection systems. These systems (HVS and automated) were assessed with defined scene content properties and video compression, i.e. H.264/MPEG-4 AVC. The performance of imaging systems/processes (e.g. subjective investigations, performance of compression algorithms) is affected by scene content properties; no other investigation has been identified that takes scene content properties into consideration to the same extent. Results have shown that the HVS is more sensitive to compression effects than the automated systems. In automated face recognition systems, 'mixed lightness' scenes were the most affected and 'low lightness' scenes the least affected by compression. In contrast, for the HVS face recognition task, 'low lightness' scenes were the most affected and 'medium lightness' scenes the least affected. For the automated human detection systems, 'close distance' and 'run approach' were among the most commonly affected scenes. The findings have the potential to broaden the methods used for testing imaging systems for security applications.
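
A sketch of the kind of harness such experiments require: re-encode a frame at increasing compression and re-run an off-the-shelf detector. OpenCV's HOG people detector and JPEG quality are stand-ins here; the thesis evaluated H.264/MPEG-4 AVC video and its own set of systems.

```python
import cv2
import numpy as np

# Stand-in detector: OpenCV's default HOG-based people detector.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

# Placeholder frame; a real harness would load CCTV footage of the
# defined scene types ('close distance', 'run approach', etc.).
frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)

for quality in (90, 50, 10):  # JPEG quality as a crude compression proxy
    ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, quality])
    decoded = cv2.imdecode(buf, cv2.IMREAD_COLOR)
    rects, _ = hog.detectMultiScale(decoded)
    print(f"quality={quality:3d}  PSNR={cv2.PSNR(frame, decoded):5.1f} dB"
          f"  detections={len(rects)}")
```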

Relevance: 40.00%

Abstract:

The concept of ontological security has a remarkable echo in current sociology as a description of the emotional state of individuals in late modernity. However, the concept, created by Giddens in the eighties, has been little used in empirical research covering various sources of risk or uncertainty. In this paper, a scale for ontological security is proposed. To do this, we start from the results of a research project focused on the relationship between risk, uncertainty, and vulnerability in the context of the economic crisis in Spain. These results were produced through nine focus groups and a telephone survey with a standardized questionnaire applied to a national sample of 2,408 individuals aged over 18. This work is divided into three main sections. In the first, a scale is built from the results of applying different items present in the questionnaire. The second part explores the relationships between the obtained scale and the variables that most closely approximate the emotional dimensions of individuals. The third part examines the variables that contribute to variation in the scale; these variables reveal the structural character of ontological security.
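
The abstract does not say how the scale's reliability was assessed; a common first check when building a scale from questionnaire items is Cronbach's alpha, sketched here on invented Likert-type responses.

```python
import numpy as np

# Cronbach's alpha: (k / (k - 1)) * (1 - sum(item variances) / var(total)).
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Invented Likert-type responses (rows = respondents, columns = items)
# driven by one latent trait, so the items should cohere.
rng = np.random.default_rng(6)
latent = rng.normal(size=(200, 1))
items = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (200, 4))), 1, 5)

print("Cronbach's alpha:", round(cronbach_alpha(items), 2))
```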

Relevance: 40.00%

Abstract:

Cyber-physical systems tightly integrate physical processes and information and communication technologies. As today’s critical infrastructures, e.g., the power grid or water distribution networks, are complex cyber-physical systems, ensuring their safety and security becomes of paramount importance. Traditional safety analysis methods, such as HAZOP, are ill-suited to assess these systems. Furthermore, cybersecurity vulnerabilities are often not considered critical, because their effects on the physical processes are not fully understood. In this work, we present STPA-SafeSec, a novel analysis methodology for both safety and security. Its results show the dependencies between cybersecurity vulnerabilities and system safety. Using this information, the most effective mitigation strategies to ensure safety and security of the system can be readily identified. We apply STPA-SafeSec to a use case in the power grid domain, and highlight its benefits.

Relevance: 40.00%

Abstract:

Taphonomic research on bones can provide additional insight into a site's formation and development, the burial environment, and ongoing post-mortem processes. A total of 30 tortoise (Cylindraspis) femur bone samples from the Mare aux Songes site (Mauritius) were studied histologically, assessing parameters such as the presence and type of microbial alteration, inclusions, staining/infiltrations, the degree of microcracking, and birefringence. The absence of microbial attack in the 4200-year-old Mare aux Songes bones suggests the animals rapidly entered the soil whole-bodied and were sealed anoxically, although they suffered from biological and chemical degradation (i.e. pyrite formation/oxidation, mineral dissolution and staining) related to changes in the site's hydrology. Additionally, carbon and nitrogen stable isotopes were analysed to obtain information on the animals' feeding behaviour. The results show narrowly distributed δ13C ratios, indicating a terrestrial C3 plant-based diet, combined with a wide range of δ15N ratios. This is most likely related to the tortoises' drought-adaptive ability to change their metabolic processes, which can affect δ15N ratios. Furthermore, ZooMS collagen fingerprinting analysis successfully identified two tortoise species (C. triserrata and C. inepta) in the bone assemblage, which, when combined with the stable isotope data, revealed significantly different δ15N ratios between the two tortoise species. As climatic changes around this period resulted in increased aridity in the Mascarene Islands, this could explain the extremely elevated δ15N ratios in our dataset. The endemic fauna was able to endure the climatic changes 4200 years ago, although human arrival in the 17th century changed the original habitat to such an extent that it resulted in the extinction of several species. Fortunately, we are still able to study these extinct tortoises thanks to the beneficial conditions of their burial environment, which resulted in excellent bone preservation.