Abstract:
We report three developments toward resolving the challenge of the apparent basal polytomy of neoavian birds. First, we describe improved conditional down-weighting techniques that reduce noise relative to signal for deeper divergences, and we find increased agreement between data sets. Second, we present formulae for calculating the probabilities of finding predefined groupings in the optimal tree. Finally, we report a significant increase in data: nine new mitochondrial (mt) genomes (the dollarbird, New Zealand kingfisher, great potoo, Australian owlet-nightjar, white-tailed trogon, barn owl, a roadrunner [a ground cuckoo], New Zealand long-tailed cuckoo, and the peach-faced lovebird), which together provide data for each of the six main groups of Neoaves proposed by Cracraft (2001). We use his six main groups of modern birds as priors for evaluating the results. These include passerines, cuckoos, parrots, and three other groups termed “WoodKing” (woodpeckers/rollers/kingfishers), “SCA” (owls/potoos/owlet-nightjars/hummingbirds/swifts), and “Conglomerati.” In general, the support is highly significant, with just two exceptions: the owls move from the “SCA” group to the raptors, particularly the accipitrids (buzzards/eagles) and the osprey; and the shorebirds may be an independent group from the rest of the “Conglomerati.” Molecular dating of the mt genomes supports a major diversification of at least 12 neoavian lineages in the Late Cretaceous. Our results form a basis for further testing with both nuclear-coding sequences and rare genomic changes.
Abstract:
Mandatory data breach notification laws have been a significant legislative reform in response to unauthorized disclosures of personal information by public and private sector organizations. These laws originated in the state-based legislatures of the United States during the last decade and have subsequently garnered worldwide legislative interest. We contend that there are conceptual and practical concerns regarding mandatory data breach notification laws which limit the scope of their applicability, particularly in relation to existing information privacy law regimes. We outline these concerns here, in the light of recent European Union and Australian legal developments in this area.
Abstract:
In the medical and healthcare arena, patients' data are not just their personal histories but also a valuable large dataset for finding solutions to diseases. While electronic medical records are becoming popular and are used in healthcare workplaces such as hospitals, as well as by insurance companies and by major stakeholders such as physicians and their patients, access to such information must be handled in a way that preserves privacy and security. Finding the best way to keep the data secure has therefore become an important issue in the area of database security. Sensitive medical data should be encrypted in databases, and there are many encryption/decryption techniques and algorithms for preserving privacy and security. Their performance is an important factor when medical data are managed in databases. Another important factor is that stakeholders should choose more cost-effective approaches to reduce the total cost of ownership. As an alternative, DAS (Data as Service) is a popular outsourcing model that satisfies this cost-effectiveness, but it requires that the encryption/decryption modules be handled by trustworthy stakeholders. This research project focuses on query response times in a DAS model (AES-DAS) and compares the outsourcing model with an in-house model that incorporates the Microsoft built-in encryption scheme in SQL Server. The project includes building a prototype of medical database schemas, with two stages of simulations. The first stage uses six databases to measure the performance of plain text, Microsoft built-in encryption, and AES-DAS. In particular, AES-DAS incorporates symmetric key encryption with AES (Advanced Encryption Standard) and a bucket indexing processor using a Bloom filter. Results are categorised by character type, numeric type, range queries, range queries using the bucket index, and aggregate queries. The second stage is a scalability test from 5K to 2,560K records. The main result of these simulations is that, as an outsourcing model, AES-DAS with the bucket index is around 3.32 times faster than plain AES-DAS with 70 partitions on 10K-record databases. Retrieving numeric data takes less time than retrieving character data in AES-DAS. Aggregate query response times in AES-DAS are not as consistent as those under the Microsoft built-in encryption scheme. The scalability test shows that once the DBMS reaches a certain threshold, query response times degrade rapidly. However, there is more to investigate in order to produce further outcomes and to construct a secure EMR (Electronic Medical Record) more efficiently from these simulations.
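The two mechanisms the AES-DAS simulations combine can be sketched compactly. Below is a minimal illustration, assuming Python with the pycryptodome package; the key handling, the bucket count of 70, and all names are simplified stand-ins for illustration, not the project's implementation. Each sensitive value is AES-encrypted, tagged with a coarse bucket id, and a Bloom filter over bucket ids lets the server prune candidate partitions for a query without decrypting anything.

```python
# Minimal sketch of AES-encrypted storage with a bucket index and Bloom
# filter, in the spirit of the AES-DAS model described above.
# Assumes the pycryptodome package; all names here are hypothetical.
import hashlib
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

KEY = get_random_bytes(16)  # 128-bit AES key held by the trusted party
NUM_BUCKETS = 70            # partition count matching the simulations above

def encrypt_value(plaintext: str) -> bytes:
    """Encrypt one field value with AES in EAX mode (nonce + tag prepended)."""
    cipher = AES.new(KEY, AES.MODE_EAX)
    ciphertext, tag = cipher.encrypt_and_digest(plaintext.encode())
    return cipher.nonce + tag + ciphertext

def bucket_of(value: int) -> int:
    """Map a numeric value to one of NUM_BUCKETS equal-width partitions
    (hypothetical domain 0..999)."""
    return min(value * NUM_BUCKETS // 1000, NUM_BUCKETS - 1)

class BloomFilter:
    """Tiny Bloom filter over bucket-id tags attached to encrypted rows."""
    def __init__(self, size: int = 1024, hashes: int = 3):
        self.size, self.hashes = size, hashes
        self.bits = bytearray(size)

    def _positions(self, item: str):
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, item: str):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def might_contain(self, item: str) -> bool:
        return all(self.bits[pos] for pos in self._positions(item))

# A query ships only candidate bucket ids to the server; the client
# decrypts and filters the (small) superset of rows that comes back.
bf = BloomFilter()
rows = []
for v in [120, 450, 871]:
    rows.append((bucket_of(v), encrypt_value(str(v))))
    bf.add(str(bucket_of(v)))
candidates = [b for b in range(NUM_BUCKETS) if bf.might_contain(str(b))]
```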
Abstract:
The National Morbidity, Mortality, and Air Pollution Study (NMMAPS) was designed to examine the health effects of air pollution in the United States. The primary question was whether particulate matter was responsible for the associations between air pollution and daily mortality. Secondary questions concerned measurement error in air pollution and mortality displacement. Since then, NMMAPS has been used to answer many important questions in environmental epidemiology...
Abstract:
Monitoring environmental health is becoming increasingly important as human activity and climate change place greater pressure on global biodiversity. Acoustic sensors provide the ability to collect data passively, objectively and continuously across large areas for extended periods. While these factors make acoustic sensors attractive as autonomous data collectors, there are significant issues associated with large-scale data manipulation and analysis. We present our current research into techniques for analysing large volumes of acoustic data efficiently. We provide an overview of a novel online acoustic environmental workbench and discuss a number of approaches to scaling the analysis of acoustic data: online collaboration, and manual, automatic and human-in-the-loop analysis.
Abstract:
In order to support intelligent transportation system (ITS) road safety applications such as collision avoidance, lane departure warnings and lane keeping, Global Navigation Satellite System (GNSS) based vehicle positioning has to provide lane-level (0.5 to 1 m) or even in-lane-level (0.1 to 0.3 m) accurate and reliable positioning information to vehicle users. However, current vehicle navigation systems equipped with a single-frequency GPS receiver can only provide road-level accuracy of 5-10 metres. The positioning accuracy can be improved to sub-metre level or better with augmented GNSS techniques such as Real Time Kinematic (RTK) and Precise Point Positioning (PPP), which have traditionally been used in land surveying or in slowly moving environments. In these techniques, GNSS correction data generated from a local, regional or global network of GNSS ground stations are broadcast to users via various communication data links, mostly 3G cellular networks and communication satellites. This research aimed to investigate the performance of precise positioning systems when operating in high-mobility environments. This involves evaluating the performance of both the RTK and PPP techniques using: i) a state-of-the-art dual-frequency GPS receiver; and ii) a low-cost single-frequency GNSS receiver. Additionally, this research evaluates the effectiveness of several operational strategies in reducing the load that correction data transmission places on data communication networks, which may be problematic for future wide-area ITS service deployment. These strategies include the use of different data transmission protocols, different correction data format standards, and correction data transmission at less frequent intervals. A series of field experiments was designed and conducted for each research task. Firstly, the performance of the RTK and PPP techniques was evaluated in both static and kinematic (highway speeds exceeding 80 km/h) experiments. RTK solutions achieved an RMS precision of 0.09 to 0.2 m in static and 0.2 to 0.3 m in kinematic tests, while PPP reported 0.5 to 1.5 m in static and 1 to 1.8 m in kinematic tests using the RTKLIB software. These RMS precision values could be further improved if better RTK and PPP algorithms were adopted. The test results also showed that RTK may be more suitable for lane-level-accuracy vehicle positioning. The professional-grade (dual-frequency) and mass-market-grade (single-frequency) GNSS receivers were tested for their RTK performance in static and kinematic modes. The analysis showed that mass-market-grade receivers provide good solution continuity, although their overall positioning accuracy is worse than that of professional-grade receivers. In an attempt to reduce the load on the data communication network, we first evaluated the use of different correction data format standards, namely the RTCM version 2.x and RTCM version 3.0 formats. A 24-hour transmission test was conducted to compare network throughput. The results showed that a 66% reduction in network throughput can be achieved by using the newer RTCM version 3.0 format compared with the older RTCM version 2.x format. Secondly, experiments were conducted to examine the use of two data transmission protocols, TCP and UDP, for correction data transmission through the Telstra 3G cellular network.
The performance of each transmission method was analysed in terms of packet transmission latency, packet dropout, packet throughput, packet retransmission rate, etc. The overall network throughput and latency of UDP data transmission are 76.5% and 83.6% of those of TCP data transmission, while the overall accuracy of the positioning solutions remains at the same level. Additionally, due to the nature of UDP transmission, 0.17% of UDP packets were lost during the kinematic tests, but this loss does not lead to a significant reduction in the quality of the positioning results. The experimental results from the static and kinematic field tests also showed that the mobile network communication may be blocked for a couple of seconds, but the positioning solutions can be kept at the required accuracy level by an appropriate setting of the Age of Differential. Finally, we investigated the effects of using less frequent correction data (transmitted at 1, 5, 10, 15, 20, 30 and 60 second intervals) on the precise positioning system. As the time interval increases, the percentage of ambiguity-fixed solutions gradually decreases, while the positioning error increases from 0.1 to 0.5 m. The results showed that the positioning accuracy can still be kept at the in-lane level (0.1 to 0.3 m) when using correction data transmitted at intervals of up to 20 seconds.
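To make the UDP measurement concrete, here is a minimal sketch, using only the Python standard library, of sequence-numbered correction datagrams from which a receiver can estimate packet dropout and one-way latency (the quantities compared above). The host, port and packet layout are illustrative assumptions, not the RTCM wire format or the experiment's actual tooling.

```python
# Minimal sketch of sequence-numbered UDP transmission of correction data,
# so the receiver can estimate dropout (cf. the 0.17% loss observed above)
# and latency. Standard library only; the packet layout is hypothetical,
# not the RTCM 3.0 framing.
import socket
import struct
import time

HOST, PORT = "127.0.0.1", 2101  # 2101 is the conventional NTRIP caster port

def send_corrections(payloads, host=HOST, port=PORT):
    """Send each correction payload as one UDP datagram with a header of
    (sequence number, send timestamp) so the receiver can measure loss."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq, payload in enumerate(payloads):
        header = struct.pack("!Id", seq, time.time())  # 4 + 8 header bytes
        sock.sendto(header + payload, (host, port))
    sock.close()

def receive_corrections(expected, host=HOST, port=PORT, timeout=2.0):
    """Collect datagrams; report loss rate and mean one-way latency."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    sock.settimeout(timeout)
    seen, latencies = set(), []
    try:
        while len(seen) < expected:
            data, _ = sock.recvfrom(4096)
            seq, sent = struct.unpack("!Id", data[:12])
            seen.add(seq)
            latencies.append(time.time() - sent)
    except socket.timeout:
        pass  # any packets still missing are counted as dropped
    loss = 1.0 - len(seen) / expected
    return loss, sum(latencies) / max(len(latencies), 1)
```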
Abstract:
Distraction whilst driving on the approach to a signalized intersection is particularly dangerous, as potential vehicular conflicts and the resulting angle collisions tend to be severe. This study examines the decisions of distracted drivers during the onset of amber lights. Driving simulator data were obtained from a sample of 58 drivers under baseline and handheld mobile phone conditions at the University of Iowa National Advanced Driving Simulator. Explanatory variables include age, gender, cell phone use, distance to the stop-line, and speed. An iterative combination of decision tree and logistic regression analyses is employed to identify main effects, non-linearities, and interaction effects. Results show that novice (16-17 years) and younger (18-25 years) drivers had a heightened risk of amber light running while distracted by a cell phone, and that speed and distance thresholds yielded significant interaction effects. Driver experience, captured by age, has a multiplicative effect with distraction, making the combined effect of being inexperienced and distracted particularly risky. Solutions are needed to combat the use of mobile phones whilst driving.
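A minimal sketch of the two-step analysis described above, assuming Python with numpy, pandas, scikit-learn and statsmodels; all data below are synthetic stand-ins, not the simulator data. A shallow decision tree surfaces candidate thresholds and interactions, and a logistic regression then estimates an explicit phone-by-age interaction.

```python
# Minimal sketch: decision tree to surface thresholds/interactions, then a
# logistic regression with an explicit phone x age interaction term.
# Assumes numpy, pandas, scikit-learn and statsmodels; data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "phone": rng.integers(0, 2, n),        # 1 = handheld phone condition
    "age_group": rng.integers(0, 3, n),    # 0 = 16-17, 1 = 18-25, 2 = older
    "speed": rng.normal(60, 8, n),         # km/h at amber onset
    "dist_stop": rng.uniform(10, 100, n),  # metres to the stop-line
})
# Synthetic ground truth: distraction risk is amplified for novice drivers.
logit_p = (-2.0 + 1.0 * df.phone + 0.8 * (df.age_group == 0)
           + 0.04 * df.speed - 0.03 * df.dist_stop
           + 1.2 * df.phone * (df.age_group == 0))
df["ran_amber"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Step 1: a shallow tree suggests speed/distance splits and interactions.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(df[["phone", "age_group", "speed", "dist_stop"]], df["ran_amber"])
print(dict(zip(tree.feature_names_in_, tree.feature_importances_)))

# Step 2: logistic regression with the phone x age interaction made explicit.
logit = smf.logit("ran_amber ~ phone * C(age_group) + speed + dist_stop",
                  data=df).fit(disp=0)
print(logit.params)  # interaction terms capture the multiplicative effect
```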
Abstract:
This study evaluated the effect of eye muscle area (EMA), ossification, carcass weight, marbling and rib fat depth on the incidence of dark cutting (pHu > 5.7) using routinely collected Meat Standards Australia (MSA) data. Data were obtained from 204,072 carcasses at a Western Australian processor between 2002 and 2008. Binomial pHu compliance data were analysed using a logit model in a Bayesian framework. Increasing eye muscle area from 40 to 80 cm² increased pHu compliance by around 14% (P < 0.001) in carcasses less than 350 kg. As carcass weight increased from 150 kg to 220 kg, compliance increased by 13% (P < 0.001), and younger cattle with lower ossification were also 7% more compliant (P < 0.001). As rib fat depth increased from 0 to 20 mm, pHu compliance increased by around 10% (P < 0.001), yet marbling had no effect on dark cutting. Increasing musculature and growth, combined with good nutrition, will minimise dark cutting beef in Australia.
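As a numerical aid to interpreting the logit-scale results above, the sketch below (numpy only) shows how a coefficient on eye muscle area maps to a percentage-point change in pHu compliance through the inverse-logit link; the intercept and slope are hypothetical values chosen to roughly reproduce the ~14% figure, not the posterior estimates from the MSA analysis.

```python
# Minimal sketch of reading a logit-model effect on the probability scale.
# numpy only; beta0 and beta_ema are hypothetical illustrations chosen to
# roughly match the ~14% change above, not the study's estimates.
import numpy as np

def inv_logit(x):
    """Inverse logit (logistic) link: log-odds -> probability."""
    return 1.0 / (1.0 + np.exp(-x))

beta0, beta_ema = -1.0, 0.015        # hypothetical intercept, per-cm2 slope
ema = np.array([40.0, 80.0])         # eye muscle area range examined above
compliance = inv_logit(beta0 + beta_ema * ema)   # P(pHu <= 5.7)
print(compliance[1] - compliance[0])  # ~0.15, i.e. ~15 percentage points
```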
Abstract:
Most approaches to business process compliance are restricted to analysing the structure of processes. It has been argued that full regulatory compliance requires information not only on the structure of processes but also on what the tasks in a process do. To this end, Governatori and Sadiq [2007] proposed extending business processes with semantic annotations. We propose a methodology to automatically extract one kind of such annotation, in particular the annotations related to the data schemas and templates linked to the various tasks in a business process.
Abstract:
The quadrupole coupling constants (qcc) for 39K and 23Na ions in glycerol have been calculated from linewidths measured as a function of temperature (which in turn results in changes in solution viscosity). The qcc of 39K in glycerol is found to be 1.7 MHz, and that of 23Na is 1.6 MHz. The relaxation behavior of 39K and 23Na ions in glycerol shows magnetic field and temperature dependence consistent with the equations for transverse relaxation more commonly used to describe the reorientation of nuclei in a molecular framework with intramolecular field gradients. It is shown, however, that τc is not simply proportional to the ratio of viscosity to temperature (η/T). The 39K qcc in glycerol and the value of 1.3 MHz estimated for this nucleus in aqueous solution are much greater than the values of 0.075 to 0.12 MHz calculated from T2 measurements of 39K in freshly excised rat tissues. This indicates that, in biological samples, processes such as exchange of potassium between intracellular compartments or diffusion of ions through locally ordered regions play a significant role in determining the effective quadrupole coupling constant and correlation time governing 39K relaxation. T1 and T2 measurements of rat muscle at two magnetic fields also indicate that a more complex correlation function may be required to describe the relaxation of 39K in tissue. Similar results and conclusions are found for 23Na.
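For reference, the textbook extreme-narrowing expressions connecting a measured linewidth to T2, the qcc (χ) and the correlation time τc for a quadrupolar nucleus of spin I are given below; this is the standard reference form (both 39K and 23Na have I = 3/2), not necessarily the exact equation set the authors used.

```latex
% Textbook extreme-narrowing limit for quadrupolar relaxation of a nucleus
% of spin I; reference form only, not necessarily the study's equations.
\frac{1}{T_1} = \frac{1}{T_2}
  = \frac{3\pi^{2}}{10}\,\frac{2I+3}{I^{2}(2I-1)}
    \left(1 + \frac{\eta^{2}}{3}\right)\chi^{2}\tau_{c},
\qquad \chi = \frac{e^{2}qQ}{h},
\qquad \Delta\nu_{1/2} = \frac{1}{\pi T_{2}} .
% For I = 3/2 the spin factor (2I+3)/(I^2(2I-1)) reduces to 4/3, so a
% measured linewidth plus an estimate of tau_c fixes chi (the qcc).
```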
Abstract:
The skyrocketing trend for social media on the Internet greatly alters analytical Customer Relationship Management (CRM). Against this backdrop, the purpose of this paper is to advance the conceptual design of Business Intelligence (BI) systems with data identified from social networks. We develop an integrated social network data model, based on an in-depth analysis of Facebook. The data model can inform the design of data warehouses in order to offer new opportunities for CRM analyses, leading to a more consistent and richer picture of customers' characteristics, needs, wants, and demands. Four major contributions are offered. First, Social CRM and Social BI are introduced as emerging fields of research. Second, we develop a conceptual data model to identify and systematize the data available on online social networks. Third, based on the identified data, we design a multidimensional data model as an early contribution to the conceptual design of Social BI systems and demonstrate its application by developing management reports in a retail scenario. Fourth, intellectual challenges for advancing Social CRM and Social BI are discussed.
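A minimal sketch of what such a multidimensional model could look like, using sqlite3 from the Python standard library; the dimension and fact tables, column names and the sample report are hypothetical illustrations in the spirit of the paper's design, not its actual model.

```python
# Minimal sketch of a multidimensional (star schema) model for Social BI.
# Uses sqlite3 from the standard library; all table/column names are
# hypothetical illustrations, not the paper's data model.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER PRIMARY KEY,
    gender TEXT, age_band TEXT, region TEXT  -- profile data from the network
);
CREATE TABLE dim_content (
    content_id INTEGER PRIMARY KEY,
    content_type TEXT,                       -- post, comment, like, share
    topic TEXT                               -- e.g. product category mentioned
);
CREATE TABLE dim_date (
    date_id INTEGER PRIMARY KEY, day TEXT, month TEXT, year INTEGER
);
CREATE TABLE fact_interaction (              -- one row per customer action
    customer_id INTEGER REFERENCES dim_customer,
    content_id  INTEGER REFERENCES dim_content,
    date_id     INTEGER REFERENCES dim_date,
    sentiment   REAL,                        -- scored from the post text
    reach       INTEGER                      -- friends/followers exposed
);
""")

# A typical management report in a retail scenario: interaction volume and
# mean sentiment by region and month, rolled up from the fact table.
report = conn.execute("""
    SELECT c.region, d.month, COUNT(*) AS interactions, AVG(f.sentiment)
    FROM fact_interaction f
    JOIN dim_customer c USING (customer_id)
    JOIN dim_date d USING (date_id)
    GROUP BY c.region, d.month
""").fetchall()
```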
Abstract:
Objectives: Despite many years of research, there is currently no treatment available that results in major neurological or functional recovery after traumatic spinal cord injury (tSCI). In particular, no conclusive data on the role of the timing of decompressive surgery, or on the impact of injury severity on its benefit, have been published to date. This paper presents a protocol that was designed to examine the hypothesized association between the timing of surgical decompression and the extent of neurological recovery in tSCI patients. Study design: The SCI-POEM study is a Prospective, Observational European Multicenter comparative cohort study. This study compares acute (<12 h) versus non-acute (>12 h, <2 weeks) decompressive surgery in patients with a traumatic spinal column injury and concomitant spinal cord injury. The sample size calculation was based on a representative European cohort of 492 tSCI patients. During a 4-year period, 300 patients will need to be enrolled from 10 trauma centers across Europe. The primary endpoint is the lower-extremity motor score as assessed according to the 'International standards for neurological classification of SCI' at 12 months after injury. Secondary endpoints include motor, sensory, imaging and functional outcomes at 3, 6 and 12 months after injury. Conclusion: In order to minimize bias and reduce the impact of confounders, special attention is paid to key methodological principles in this study protocol. A significant difference in safety and/or efficacy endpoints will provide meaningful information to clinicians, as this would confirm the hypothesis that rapid referral to and treatment in specialized centers result in important improvements in tSCI patients. Spinal Cord advance online publication, 17 April 2012; doi:10.1038/sc.2012.34.
Abstract:
The development of the Learning and Teaching Academic Standards Statement for Architecture (the Statement) centred on requirements for the Master of Architecture and proceeded alongside similar developments in the building and construction discipline under the guidance and support of the Australian Deans of Built Environment and Design (ADBED). Through their representation of Australian architecture programs, ADBED have provided high-level leadership for the Learning and Teaching Academic Standards Project in Architecture (LTAS Architecture). The threshold learning outcomes (TLOs), the description of the nature and extent of the discipline, and accompanying notes were developed through wide consultation with the discipline and profession nationally. They have been considered and debated by ADBED on a number of occasions and have, in their final form, been strongly endorsed by the Deans. ADBED formed the core of the Architecture Reference Group (chaired by an ADBED member) that drew together representatives of every peak organisation for the profession and discipline in Australia. The views of the architectural education community and profession have been provided both through individual submissions and the voices of a number of peak bodies. Over two hundred individuals from the practising profession, the academic workforce and the student cohort have worked together to build consensus about the capabilities expected of a graduate of an Australian Master of Architecture degree. It was critical from the outset that the Statement should embrace the wisdom of the greater ‘tribe’, should ensure that graduates of the Australian Master of Architecture were eligible for professional registration and, at the same time, should allow for scope and diversity in the shape of Australian architectural education. A consultation strategy adopted by the Discipline Scholar involved meetings and workshops in Perth, Melbourne, Sydney, Canberra and Brisbane. Stakeholders from all jurisdictions and most universities participated in the early phases of consultation through a series of workshops that concluded late in October 2010. The Draft Architecture Standards Statement was formed from these early meetings, and consultation in respect of that document continued through early 2011. This publication represents the outcomes of work to establish an agreed standards statement for the Master of Architecture. Significant further work remains to ensure the alignment of professional accreditation and recognition procedures with emerging regulatory frameworks cascading from the establishment of the Tertiary Education Quality and Standards Agency (TEQSA). The Australian architecture community hopes that mechanisms can be found to integrate TEQSA’s quality assurance purpose with well-established and understood systems of professional accreditation to ensure the good standing of Australian architectural education into the future. The work to build renewed and integrated quality assurance processes and to foster the interests of this project will continue, for at least the next eighteen months, under the auspices of the Australian Learning and Teaching Council (ALTC)-funded Architecture Discipline Network (ADN), led by ADBED and Queensland University of Technology. The Discipline Scholar gratefully acknowledges the generous contributions given by those in stakeholder communities to the formulation of the Statement. Professional and academic colleagues have travelled and gathered to shape the Standards Statement.
Debate has been vigorous and spirited, and the Statement is rich with the purpose, critical thinking and good judgement of the Australian architectural education community. The commitments made to the processes that have produced this Statement reflect a deep and abiding interest by the constituency in architectural education. This commitment bodes well for the vibrancy and productivity of the emergent Architecture Discipline Network (ADN). Endorsement, in writing, was received from the Australian Institute of Architects National Education Committee (AIA NEC): The National Education Committee (NEC) of the Australian Institute of Architects thank you for your work thus far in developing the Learning and Teaching Academic Standards for Architecture. In particular, we acknowledge your close consultation with the NEC on the project along with a comprehensive cross-section of the professional and academic communities in architecture. The TLOs with the nuanced levels of capacities – to identify, develop, explain, demonstrate etc. – are described at an appropriate level to be understood as minimum expectations for a Master of Architecture graduate. The Architects Accreditation Council of Australia (AACA) has noted: There is a clear correlation between the current processes for accreditation and what may be the procedures in the future following the current review. The requirement of the outcomes as outlined in the draft paper to demonstrate capability is an appropriate way of expressing the measure of whether the learning outcomes have been achieved. The measure of capability as described in the outcome statements is enhanced with explanatory descriptions in the accompanying notes.
Abstract:
Data mining techniques extract repeated and useful patterns from a large data set, which in turn are utilized to predict the outcomes of future events. The main purpose of the research presented in this paper is to investigate data mining strategies and develop an efficient framework for multi-attribute project information analysis to predict the performance of construction projects. The research team first reviewed existing data mining algorithms, then applied them to systematically analyze a large project data set collected by survey, and finally proposed a data-mining-based decision support framework for project performance prediction. To evaluate the potential of the framework, a case study was conducted using data collected from 139 capital projects, analyzing the relationship between the use of information technology and project cost performance. The study results showed that the proposed framework has the potential to promote fast, easy-to-use, interpretable, and accurate project data analysis.
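A minimal sketch of the kind of interpretable prediction component such a framework could contain, assuming Python with numpy and scikit-learn; the features, data and labels are synthetic stand-ins, not the surveyed project data. A shallow decision tree relates IT-use attributes to cost performance and is printed as readable rules.

```python
# Minimal sketch of an interpretable project-performance model: a shallow
# decision tree relating IT-use attributes to cost performance.
# Assumes numpy and scikit-learn; all features and data are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 139  # mirrors the number of capital projects in the case study
X = np.column_stack([
    rng.uniform(0, 1, n),    # degree of IT integration (0-1 index)
    rng.uniform(0, 1, n),    # electronic data exchange usage (0-1 index)
    rng.uniform(1, 500, n),  # project size (hypothetical $M)
])
# Hypothetical label: 1 = project met its cost target.
y = ((0.6 * X[:, 0] + 0.4 * X[:, 1] + rng.normal(0, 0.2, n)) > 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
# Readable decision rules support the "interpretable" goal stated above.
print(export_text(model, feature_names=["it_integration", "edi_usage", "size"]))
```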