45 results for: computer forensics, digital evidence, computer profiling, time-lining, temporal inconsistency, computer forensic object model

at Digital Commons at Florida International University


Relevance:

100.00%

Abstract:

Petri nets are a formal, graphical, and executable modeling technique for the specification and analysis of concurrent and distributed systems, and they have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows but are not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs yield compact system models that are easier to understand, and they are therefore more useful for modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, while analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework supported by a tool. For modeling, the framework integrates two formal languages: a type of HLPN called the Predicate Transition net (PrT net) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main modeling contribution of this dissertation is a software tool that supports the framework's formal modeling capabilities. For analysis, the framework combines three complementary techniques: simulation, explicit-state model checking, and bounded model checking (BMC). Simulation is straightforward and fast but covers only some execution paths in an HLPN model. Explicit-state model checking covers all execution paths but suffers from the state-explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit-state model checking. The main analysis contribution of this dissertation is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the framework's formal analysis capabilities. The SAMTools suite developed for this framework integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
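
Since the abstract leans on Petri net firing semantics, here is a minimal sketch of a low-level Petri net in Python: places hold token counts, and a transition fires when every input place is marked. The class and example net are illustrative assumptions, not part of SAMTools; PrT nets additionally attach structured data to tokens and algebraic formulas to transitions.

```python
# Minimal low-level Petri net: places hold token counts; a transition
# is enabled when every input place has at least one token. Firing
# consumes one token per input place and produces one per output place.
class PetriNet:
    def __init__(self, marking, transitions):
        # marking: dict place -> token count
        # transitions: dict name -> (input places, output places)
        self.marking = dict(marking)
        self.transitions = transitions

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A producer/consumer control flow: 'send' moves a token from 'ready'
# to 'buffer'; 'receive' consumes it.
net = PetriNet(
    marking={"ready": 1, "buffer": 0, "done": 0},
    transitions={"send": (["ready"], ["buffer"]),
                 "receive": (["buffer"], ["done"])},
)
net.fire("send")
net.fire("receive")
print(net.marking)  # {'ready': 0, 'buffer': 0, 'done': 1}
```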

Relevance:

100.00%

Abstract:

Cardiac troponin I (cTnI) is one of the most useful serum markers for the determination of myocardial infarction (MI). The first commercial cTnI assay was released for medical use in the United States and Europe in 1995. It is useful in determining whether chest pain of unknown etiology is cardiac in origin, as cTnI is released into the bloodstream following myocardial necrosis (cardiac cell death) resulting from an infarct (heart attack). This research project investigates the utility of cardiac troponin I as a potential marker for the determination of time of death. The approach is not to investigate cTnI degradation in serum/plasma, but to investigate the proteolytic breakdown of the protein in heart tissue postmortem. If our hypothesis is correct, cTnI should show a distinctive temporal degradation profile after death, and this profile may have potential as a time-of-death marker in forensic medicine. The field of time-of-death markers has lagged behind the great technological advances made since the late 1850s; today, medical examiners still use rudimentary time-of-death markers that offer limited reliability in the medico-legal arena. Cardiac TnI must be stabilized during extraction to avoid further degradation by proteases. Chemically derivatized magnetic microparticles were covalently linked to anti-cTnI monoclonal antibodies; a charge-capture approach, exploiting the negative charge on the microparticles, was also used to eliminate the antibody from the microparticles. The magnetic microparticles were used to extract cTnI from heart-tissue homogenate for further bioanalysis. Cardiac TnI was eluted from the beads with a buffer and analyzed: the technique exploits the banding pattern on sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE), followed by western blot transfer to a polyvinylidene fluoride (PVDF) membrane for probing with anti-cTnI monoclonal antibodies. Bovine hearts, given their homology to human cardiac TnI, were used as a model to establish the relationship between time of death and concentration/band pattern. The concept's feasibility was finally tested with human heart samples from cadavers with known times of death.
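
As a hypothetical illustration of how a postmortem degradation profile could serve as a time-of-death marker, the sketch below fits a first-order exponential decay to made-up densitometry readings of the intact cTnI band and inverts the fitted curve to estimate the postmortem interval. The decay model, rate constant, and data are assumptions for illustration, not results of this study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical densitometry of the intact cTnI band at known postmortem
# intervals (hours); values are invented for illustration only.
t = np.array([0, 12, 24, 48, 72, 96], dtype=float)
intensity = np.array([1.00, 0.78, 0.61, 0.37, 0.22, 0.14])

def decay(t, i0, k):
    # Assumed first-order proteolytic decay: I(t) = I0 * exp(-k * t)
    return i0 * np.exp(-k * t)

(i0, k), _ = curve_fit(decay, t, intensity, p0=(1.0, 0.02))

def estimate_pmi(observed):
    # Invert the fitted curve to estimate the postmortem interval
    return np.log(i0 / observed) / k

print(f"k = {k:.4f}/h; PMI for intensity 0.5 is about {estimate_pmi(0.5):.1f} h")
```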

Relevance:

100.00%

Abstract:

The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crime. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence, providing a platform for deep network analysis by capturing, recording, and analyzing network events to identify the source of a security attack or other information-security incident. Existing network forensics work has focused mostly on the Internet and fixed networks, but the exponential growth of wireless technologies, coupled with their unprecedented characteristics, necessitates the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad hoc network forensics. It was one of the first works to identify this problem and to offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording, location is essential to documenting network incidents; in network topology spaces, however, location cannot be measured because no distance metric exists. A novel solution was therefore proposed to label the locations of nodes within network topology spaces and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHTs) was adopted. Because the direct use of DHTs for reporting logged incidents would generate uncontrollably recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aid forensics over cellular and ad hoc networks, which in turn increases the ability to track and trace attacks to their sources. They are a starting point for further research and development toward equipping future ad hoc networks with forensic components that complement existing security mechanisms.
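
A minimal sketch of DHT-based incident reporting, assuming a consistent-hashing ring: each logged incident is stored at the node whose identifier follows the hash of the incident key, so an investigator can later retrieve it with a single lookup. Node names and the key scheme are invented, and the dissertation's mechanism for suppressing recursive traffic is not reproduced here.

```python
import hashlib
from bisect import bisect_left

def h(value: str) -> int:
    # Map a string onto a 32-bit identifier ring
    return int(hashlib.sha1(value.encode()).hexdigest(), 16) % 2**32

class IncidentDHT:
    def __init__(self, node_names):
        # Each node owns the arc of the ring that ends at its identifier
        self.ring = sorted((h(n), n) for n in node_names)
        self.store = {name: {} for name in node_names}

    def responsible(self, key_id):
        ids = [i for i, _ in self.ring]
        idx = bisect_left(ids, key_id) % len(self.ring)  # wrap around
        return self.ring[idx][1]

    def report(self, incident_key, record):
        # Store the incident record at the responsible node
        self.store[self.responsible(h(incident_key))][incident_key] = record

    def lookup(self, incident_key):
        return self.store[self.responsible(h(incident_key))].get(incident_key)

dht = IncidentDHT(["node-a", "node-b", "node-c"])
dht.report("attack:2004-07-01:node-b", {"type": "flooding", "src": "node-x"})
print(dht.lookup("attack:2004-07-01:node-b"))
```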

Relevance:

100.00%

Abstract:

The increasing amount of available semistructured data demands efficient mechanisms to store, process, and search enormous corpora of data if its global adoption is to be encouraged. Current techniques for storing semistructured documents either map them to relational databases or use a combination of flat files and indexes. Both approaches create a mismatch between the tree structure of semistructured data and the access characteristics of the underlying storage devices. Furthermore, the inefficiency of XML parsing methods has slowed the large-scale adoption of XML in actual system implementations. The recent development of lazy parsing techniques is a major step toward improving this situation, but lazy parsers still have significant drawbacks that undermine the widespread adoption of XML. Once the processing (storage and parsing) issues for semistructured data have been addressed, the next key challenge is to perform effective information discovery on such data. Previous work has addressed this problem in a generic (i.e., domain-independent) way, but the process can be improved by taking knowledge of the specific domain into consideration. This dissertation had two general goals. The first was to devise novel techniques to efficiently store and process semistructured documents, with two specific aims: we proposed a method for storing semistructured documents that maps the physical characteristics of the documents to the geometrical layout of hard drives, and we developed a Double-Lazy Parser for semistructured documents that introduces lazy behavior in both the pre-parsing and progressive-parsing phases of the standard Document Object Model parsing mechanism. The second goal was to construct a user-friendly and efficient engine for information discovery over domain-specific semistructured documents, also with two aims: we presented a framework that exploits domain-specific knowledge, incorporated through domain ontologies, to improve the quality of the information-discovery process, and we proposed meaningful evaluation metrics for comparing the results of search systems over semistructured documents.
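
The Double-Lazy Parser itself cannot be reconstructed from the abstract, but the streaming style of parsing it improves upon can be sketched with Python's standard-library iterparse, which processes elements as they complete instead of materializing a full DOM tree:

```python
# Streaming ("lazy-ish") XML processing with the standard library:
# elements are handled and discarded as soon as they close, so the
# whole document is never held in memory at once. This illustrates the
# kind of laziness the dissertation pushes further into pre-parsing and
# progressive parsing; it is not the Double-Lazy Parser itself.
import io
import xml.etree.ElementTree as ET

doc = io.StringIO(
    "<library>"
    "<book id='1'><title>Data on Disk</title></book>"
    "<book id='2'><title>Lazy Parsing</title></book>"
    "</library>"
)

for event, elem in ET.iterparse(doc, events=("end",)):
    if elem.tag == "book":
        print(elem.get("id"), elem.findtext("title"))
        elem.clear()  # free the finished subtree immediately
```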

Relevance:

100.00%

Abstract:

The purpose of this study was to investigate teachers' espoused instructional beliefs and whether they differed in relation to schools' socioeconomic status, the extent of teachers' educational background, or the extent of teachers' classroom experience. The study comprised 242 Miami-Dade County public school educators who responded to a thirty-nine-question Likert-scale instrument, the Literacy Instructional Practices Questionnaire. Eighteen schools, three from each of the six regions, were purposively selected based on the socioeconomic status of their students. Nine participants were interviewed using semi-standardized interview procedures and open-ended questioning techniques. Multivariate analysis of variance (MANOVA) results revealed that teachers' espoused beliefs concerning the instruction of literacy, and the forces and influences affecting instruction, do not significantly differ by schools' socioeconomic status, extent of teachers' educational background, or extent of teachers' classroom experience. The majority of teachers appear to follow a top-down direct-instruction model: generally, students are taught as a whole class and ability-grouped for specific skill instruction using commercially produced reading and language arts texts. There was no evidence of a relationship between the three independent variables and teachers' espoused beliefs concerning either the model of instruction they practice or research and its application to practice. Interview data corroborated much of the information garnered through the questionnaire; however, interview participants espoused the belief that research did not influence their selection of instructional practices. Although teachers perceive themselves as eclectic in their espoused instructional beliefs, they appear to follow a skills-based direct-instruction pedagogy in practice. Few researchers recommend much of what teachers believe constitutes effective practice, affirming the findings of Calderhead (1993) and the National Educational Research Policy and Priorities Board (U.S. Department of Education, 1998, p. 18) that "educators rarely know research, seek it out, or act in accordance with its results."

Relevance:

100.00%

Abstract:

Predation risk influences a variety of behavioral decisions in many organisms, forcing animals to trade off safety against other activities. The effects of predation, however, have been largely ignored in the study of vertebrates that forage underwater (divers). I tested the predictions of an optimal diving model that incorporates the risk of predation, using red-eared slider turtles (Trachemys scripta elegans). Specifically, I tested the hypothesis that divers will increase their surface time when instantaneous risk decreases with time at the surface. Using a model aerial predator and exposing turtles to both risk and no-risk treatments, I tested how turtles perceive risk at the surface and whether they increase or decrease their surface time depending on how they assess that risk. The model's predictions for situations in which risk at the surface decreases with time spent there (likely the case for aerial predation) were supported by the results. Surface time and time spent submerged per dive were significantly greater when turtles were at risk, and turtles also spent more time resting on the bottom under this treatment. Interestingly, turtles under risk engaged in vigilance behaviors on the bottom just prior to surfacing. This behavior could have implications for the model's predictions, and future experiments are needed to test whether subsurface vigilance alters diving decisions made under risk.
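
A toy numeric illustration of the model's core prediction, under the assumption that instantaneous surface risk decays exponentially with time already spent at the surface while oxygen gain saturates (all parameter values invented): the risk accrued per unit of oxygen gained falls as surface intervals lengthen, which favors longer surface times under risk.

```python
import numpy as np

# Toy model: instantaneous surface hazard decays with time at the
# surface, h(t) = h0 * exp(-a*t), while oxygen gain saturates,
# O(s) = o_max * (1 - exp(-b*s)). Parameters are invented.
h0, a = 0.05, 0.8      # initial hazard and its decay rate
o_max, b = 1.0, 0.5    # oxygen ceiling and uptake rate

def cumulative_risk(s):
    # Integral of h(t) from 0 to s
    return h0 / a * (1 - np.exp(-a * s))

def oxygen(s):
    return o_max * (1 - np.exp(-b * s))

for s in [0.5, 1, 2, 4, 8]:
    print(f"surface time {s:>3}: risk per unit O2 = "
          f"{cumulative_risk(s) / oxygen(s):.3f}")
# The ratio falls as surface time grows, consistent with turtles
# lengthening their surface time under a decaying aerial-risk regime.
```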

Relevance:

100.00%

Abstract:

With evidence of increasing hurricane risks in the Georgia Coastal Area (GCA) and Virginia in the U.S. Southeast, and elsewhere, understanding intended evacuation behavior is becoming more and more important for community planners. My research investigates intended evacuation behavior in the face of hurricane risk, drawing on a behavioral survey of the six GCA counties conducted under the direction of two social scientists with extensive experience in survey research on citizen and household response to emergencies and disasters. Respondents indicated whether they would evacuate under both voluntary and mandatory evacuation orders. Bivariate probit models are used to investigate the subjective belief structure of whether or not respondents are concerned about the hurricane, and the intended probability of evacuating, as a function of risk perception and a range of demographic and socioeconomic variables (e.g., gender, military status, age, length of residence, vehicle ownership).
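
A compact sketch of a bivariate probit likelihood follows, with the two binary outcomes standing in for evacuation under voluntary and mandatory orders. Data, covariates, and starting values are placeholders; in practice a purpose-built econometrics package would be preferable to hand-rolled optimization.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

def bivariate_probit_nll(params, X, y1, y2):
    # params = [beta1 | beta2 | atanh(rho)]; y1, y2 in {0, 1}
    k = X.shape[1]
    b1, b2, rho = params[:k], params[k:2*k], np.tanh(params[-1])
    q1, q2 = 2*y1 - 1, 2*y2 - 1  # sign flips encode all four outcomes
    ll = 0.0
    for xi, s1, s2 in zip(X, q1, q2):
        p = multivariate_normal.cdf(
            [s1 * (xi @ b1), s2 * (xi @ b2)],
            mean=[0, 0], cov=[[1, s1*s2*rho], [s1*s2*rho, 1]])
        ll += np.log(max(p, 1e-12))
    return -ll

# Placeholder data: intercept plus one covariate (e.g., risk perception)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y1 = (X @ [0.2, 0.8] + rng.normal(size=200) > 0).astype(int)
y2 = (X @ [0.5, 0.6] + rng.normal(size=200) > 0).astype(int)

res = minimize(bivariate_probit_nll, np.zeros(5), args=(X, y1, y2),
               method="BFGS")
print("betas:", res.x[:4], "rho:", np.tanh(res.x[-1]))
```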

Relevance:

100.00%

Abstract:

The purpose of this research paper is to further the understanding of, and explore the literature on, moral leadership models. A review of the literature was undertaken, synthesizing concepts and offering a new paradigm for educational leaders.

Relevance:

100.00%

Abstract:

This mixed-methods study examined effects of a staff development model on instructional practices and dispositions of P-12 teachers. The model design was guided by participants’ varying developmental levels and their values and beliefs about teaching and learning. The study adds to our understanding of the need for teacher-centered professional development.

Relevance:

100.00%

Abstract:

Hospitality programs in the United States are continually undergoing curriculum review to stay current and to produce graduates who will excel in the industry. This article describes the revision process used by one university.

Relevance:

100.00%

Abstract:

While absenteeism models have been developed and applied in the manufacturing industries, little work has been done on absenteeism in service industries. Due to the labor intensity of service industries, specifically the hotel industry, a model to track and quantify the costs of absenteeism could be useful to managers. The authors propose just such a model.

Relevance:

100.00%

Abstract:

While studies of metazoan cell proliferation, cell differentiation, and cytokine signaling laid the foundation for the current paradigms of tyrosine kinase signaling, similar studies in lower eukaryotes have provided invaluable insight into mammalian pathways, such as the Wnt and STAT pathways. Dictyostelium is one of the leading lower eukaryotic model systems, in which stress-induced cellular responses, Wnt-like pathways, and STAT-mediated pathways are well investigated. These Dictyostelium pathways are reviewed here together with their mammalian counterparts to facilitate a comparative understanding of these variant and noncanonical pathways.

Relevance:

100.00%

Abstract:

The purpose of this study is to produce a model that state regulatory agencies can use to assess demand for subacute care. In accomplishing this goal, the study refines the definition of subacute care, demonstrates a method for bed-need assessment, and measures the effectiveness of this new level of care. This was the largest study of subacute care to date. Research focused on 19 subacute units in 16 states, each providing high-intensity rehabilitative and/or restorative care in a high-tech unit. Each facility was based in a nursing home but utilized separate staff, equipment, and services. Because these facilities are under local control, it was possible to study regional differences in demand for subacute care. Using these data, a model for predicting demand for subacute care services was created, building on earlier models submitted by John Whitman for the American Hospital Association and by Robin E. MacStravic. The Broderick model uses the "bootstrapping" method and takes advantage of high technology: computers and software, databases in business and government, publicly available databases from providers or commercial vendors, professional organizations, and other information sources. Using these newly available sources of information, the model addresses the problems and needs of health care planners as they approach the challenges of the 21st century.
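
As a hedged illustration of the bootstrapping step, the sketch below resamples hypothetical average-census figures from subacute units to put a confidence interval around a bed-need estimate. The census values, occupancy target, and bed-need formula are invented stand-ins, not the Broderick model's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical average daily census (occupied beds) of 19 subacute
# units; real inputs would come from the surveyed units' records.
census = np.array([18, 22, 15, 30, 25, 19, 27, 21, 24, 16,
                   28, 20, 23, 17, 26, 29, 14, 31, 22])

def bed_need(sample, occupancy_target=0.85):
    # Beds required for the mean census to meet the target occupancy
    return sample.mean() / occupancy_target

# Bootstrap: resample units with replacement, re-estimate bed need,
# and read off the 2.5th and 97.5th percentiles.
boot = np.array([bed_need(rng.choice(census, size=census.size, replace=True))
                 for _ in range(10_000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bed need = {bed_need(census):.1f} (95% CI {lo:.1f}-{hi:.1f})")
```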