985 results for Open Standards


Relevance: 30.00%

Publisher:

Abstract:

After attending this presentation, attendees will gain awareness of the ontogeny of cranial maturation, specifically: (1) the fusion timings of primary ossification centers in the basicranium; and (2) the temporal pattern of closure of the anterior fontanelle, to develop new population-specific age standards for medicolegal death investigation of Australian subadults. This presentation will impact the forensic science community by demonstrating the potential of a contemporary forensic subadult Computed Tomography (CT) database of cranial scans and population data to recalibrate existing standards for age estimation and to quantify growth and development of Australian children. This research offers a study design applicable to all countries faced with a paucity of skeletal repositories. Accurate assessment of age-at-death of skeletal remains is a key element of forensic anthropology methodology. In Australian casework, age standards derived from American reference samples are applied owing to the scarcity of documented Australian skeletal collections. Currently, practitioners rely on antiquated standards, such as the Scheuer and Black1 compilation for age estimation, despite the implications of secular trends and population variation. Skeletal maturation standards are population-specific and should not be extrapolated from one population to another, while secular changes in skeletal dimensions and accelerated maturation underscore the importance of establishing modern standards to estimate age in modern subadults. Despite CT imaging becoming the gold standard for skeletal analysis in Australia, practitioners caution against applying forensic age standards derived from macroscopic inspection to a CT medium, suggesting a need for revised methodologies. Multi-slice CT scans of subadult crania and cervical vertebrae 1 and 2 were acquired from 350 Australian individuals (males: n=193, females: n=157) aged birth to 12 years.
The CT database, projected at 920 individuals upon completion (January 2014), comprises thin-slice DICOM data (resolution: 0.5/0.3mm) of patients scanned since 2010 at major Brisbane children's hospitals. DICOM datasets were subject to manual segmentation, followed by the construction of multi-planar and volume-rendering cranial models for subsequent scoring. The union of the primary ossification centers of the occipital bone was scored as open, partially closed, or completely closed, while the fontanelles and vertebrae were scored in accordance with two stages. Transition analysis was applied to elucidate the age at transition between union states for each center, and robust age parameters were established using Bayesian statistics. Closure of the fontanelles and contiguous sutures in Australian infants occurs earlier than reported in the literature, with the anterior fontanelle transitioning from open to closed at 16.7±1.1 months. The metopic suture is closed prior to 10 weeks post-partum and completely obliterated by 6 months of age, independent of sex. Utilizing reverse-engineering capabilities, an alternative method for infant age estimation based on quantification of fontanelle area and non-linear regression with variance component modeling will be presented. Closure models indicate that the greatest rate of change in anterior fontanelle area occurs prior to 5 months of age. This study complements the work of Scheuer and Black1, providing more specific age intervals for the union and temporal maturity of each primary ossification center of the occipital bone. For example, dominant fusion of the sutura intra-occipitalis posterior occurs before 9 months of age, followed by persistence of a hyaline cartilage tongue posterior to the foramen magnum until 2.5 years, with obliteration at 2.9±0.1 years.
Recalibrated age parameters for the atlas and axis are presented, with the anterior arch of the atlas appearing at 2.9 months in females and 6.3 months in males, while the dentoneural, dentocentral, and neurocentral junctions of the axis transitioned from non-union to union at 2.1±0.1 years in females and 3.7±0.1 years in males. These results exemplify significant sexual dimorphism in maturation (p<0.05), with girls exhibiting union earlier than boys, justifying the need for sex-specific standards for age estimation. Studies such as this are imperative for providing updated standards for Australian forensic and pediatric practice and for providing insight into the skeletal development of this population. During this presentation, the utility of novel regression models for age estimation of infants will be discussed, with emphasis on the three-dimensional modeling capabilities of complex structures such as fontanelles for the development of new age estimation methods.
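The non-linear closure modeling of fontanelle area described above can be sketched in a few lines. This is an illustrative fit of a simple exponential decay of anterior fontanelle area with age, on synthetic data; the model form, parameter values, and data are assumptions for demonstration only, and the authors' actual approach additionally used variance component modeling.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical closure model: fontanelle area decays roughly
# exponentially with age. The study's real model also included
# variance components, which are omitted here for brevity.
def closure_model(age_months, a0, k):
    """Anterior fontanelle area (cm^2) as a function of age (months)."""
    return a0 * np.exp(-k * age_months)

# Synthetic data for illustration only (not study measurements).
rng = np.random.default_rng(42)
ages = np.linspace(0, 24, 50)
areas = closure_model(ages, 6.0, 0.25) + rng.normal(0, 0.1, ages.size)

params, _ = curve_fit(closure_model, ages, areas, p0=(5.0, 0.1))
a0_hat, k_hat = params

# For a pure exponential decay the steepest decline is at birth,
# consistent with the reported finding that the greatest rate of
# change occurs before 5 months of age.
print(f"estimated initial area: {a0_hat:.2f} cm^2, rate: {k_hat:.3f}/month")
```

A fit like this yields an age estimate by inverting the fitted curve for a measured fontanelle area.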

Relevance: 30.00%

Publisher:

Abstract:

Introduction The multifactorial nature of clinical skills development makes assessment of undergraduate radiation therapist competence by clinical mentors challenging. A recent overhaul of the clinical assessment strategy at Queensland University of Technology has moved away from the high-stakes Objective Structured Clinical Examination (OSCE) to encompass a more continuous measure of competence. This quantitative study aimed to gather stakeholder evidence to inform the development of standards by which to measure student competence at a range of levels of progression. Methods A simple anonymous questionnaire was distributed to all Queensland radiation therapists. The tool asked respondents to assign different levels of competency, for a range of clinical tasks, to different levels of student. All data were anonymous and were combined for analysis using Microsoft Excel. Results Feedback indicated good agreement with tasks that specified the amount of direction required, and this has been incorporated into the new clinical achievements record that students need to have signed off. Unexpectedly, findings also suggested higher expectations for planning tasks than for treatment-based tasks. Conclusion The findings suggest that the amount of direction required by students is a valid indicator of their level; this has been adopted into the clinical assessment scheme. Further work will build on this to further define standards of competency for undergraduates.

Relevance: 30.00%

Publisher:

Abstract:

One of the effects of the Internet is that the dissemination of scientific publications has, within a few years, migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.

Relevance: 30.00%

Publisher:

Abstract:

“Dissolved” (< 0.4 μm filtered) and “total dissolvable” (unfiltered) trace element samples were collected using “clean” sampling techniques from four vertical profiles in the eastern Atlantic Ocean on the first IOC Trace Metals Baseline expedition. The analytical results obtained by 9 participating laboratories for Mn, Fe, Co, Ni, Cu, Zn, Cd, Pb, and Se on samples from station 4 in the northeast Atlantic have been evaluated with respect to accuracy and precision (intercomparability). The data variability among the reporting laboratories was expressed as 2 × SD for a given element and depth, and was comparable to the 95% confidence interval reported for the NASS seawater reference standards (representing analytical variability only). The discrepancies between reporting laboratories appear to be due to inaccuracies in standardization (analytical calibration), blank correction, and/or extraction efficiency corrections. Several of the sampling bottles used at this station were not adequately pre-cleaned, producing anomalous Pb results. The sample filtration process did not appear to have been a source of contamination for either dissolved or particulate trace elements. The trace metal profiles agree in general with previously reported profiles from the Atlantic Ocean. We conclude that the sampling and analytical methods employed for this effort, while still in need of improvement, are sufficient for obtaining accurate concentration data on most trace metals in the major water masses of the oceans, and for enabling some evaluation of the biogeochemical cycling of the metals.
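The 2 × SD measure of inter-laboratory variability described above can be illustrated with a short sketch. The element, depth, and laboratory values below are invented for demonstration and are not IOC expedition data.

```python
import numpy as np

# Illustrative only: hypothetical dissolved Fe results (nmol/kg)
# reported by several laboratories for a single depth.
fe_reports = np.array([0.62, 0.58, 0.65, 0.60, 0.57, 0.63])

# Variability among laboratories, expressed as 2 x SD as in the text.
mean = fe_reports.mean()
two_sd = 2 * fe_reports.std(ddof=1)  # sample SD across reporting labs

print(f"consensus: {mean:.3f} +/- {two_sd:.3f} nmol/kg (2 x SD)")
```

Comparing this spread with the reference standard's 95% confidence interval, as the study does, indicates whether inter-lab discrepancies exceed purely analytical variability.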

Relevance: 30.00%

Publisher:

Abstract:

The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science", "Standards and Interoperability", "Open Science and Reproducibility", "Translational Bioinformatics", "Visualization", and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community", that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.

Relevance: 30.00%

Publisher:

Abstract:

"With the growing spread of social media and internet-based teaching, eLearning content is becoming ever more important. Open Educational Resources (OER) also belong in the context of eLearning and internet-based teaching. OER are digital learning and teaching materials that are freely accessible to teachers and students and may also be freely distributed. [...] In order to exchange, find, and obtain OER, and to make them accessible on a broad basis, in particular via search engines, metadata are required for the respective materials. [...] To examine what action and research are needed on the topic of metadata for Open Educational Resources, an overview is first given of the currently existing national and international metadata standards for eLearning objects. [...] From this, recommendations emerge as to which metadata standards might be suitable for further use and promotion. The possibilities of creating a new metadata standard, as well as a common portal for OER, are also discussed, with particular attention to the problems to be expected and the associated requirements." (DIPF/Orig.)
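As a minimal illustration of the kind of metadata record the paper calls for, the sketch below assembles a Dublin Core style OER record and serializes it for harvesting; the element names and values are assumptions for demonstration, not a proposed standard.

```python
import json

# A minimal, hypothetical OER metadata record using Dublin Core style
# element names; field choices are illustrative only.
oer_record = {
    "dc:title": "Introduction to Statistics (open course slides)",
    "dc:creator": "Example University",
    "dc:language": "de",
    "dc:rights": "CC BY 4.0",          # an open licence makes free reuse explicit
    "dc:format": "application/pdf",
    "dc:subject": ["statistics", "OER"],
    "dc:type": "LearningResource",
}

# Serialized metadata is what repositories and search engines harvest.
print(json.dumps(oer_record, indent=2))
```

Records like this are what make OER discoverable across portals and search engines, which is the central concern of the paper.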

Relevance: 30.00%

Publisher:

Abstract:

Next Generation Sequencing (NGS) has the potential to become an important tool in clinical diagnosis and therapeutic decision-making in oncology, owing to its enhanced sensitivity in DNA mutation detection, fast turnaround of samples in comparison with current gold-standard methods, and the potential to sequence a large number of cancer-driving genes at the same time. We aim to test the diagnostic accuracy of current NGS technology in the analysis of mutations that represent the current standard of care, and its reliability in generating concomitant information on other key genes in human oncogenesis. Thirteen clinical samples (8 lung adenocarcinomas, 3 colon carcinomas, and 2 malignant melanomas), already genotyped for EGFR, KRAS, and BRAF mutations by current standard-of-care methods (Sanger sequencing and q-PCR), were analysed for detection of mutations in the same three genes using two NGS platforms, and in an additional 43 genes with one of these platforms. The results were analysed using closed, platform-specific proprietary bioinformatics software as well as open third-party applications. Our results indicate that the existing format of the NGS technology performed well in detecting the clinically relevant mutations stated above but may not be reliable for a broader unsupervised analysis of the wider genome in its current design. Our study represents a diagnostically led validation of the major strengths and weaknesses of this technology before consideration for diagnostic use.
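A per-sample concordance check of the kind this validation implies can be sketched as follows; the sample identifiers and mutation calls are hypothetical, not the study's data.

```python
# Hypothetical per-sample mutation calls, invented to illustrate a
# concordance check between standard-of-care genotyping and NGS.
standard_calls = {  # Sanger / q-PCR result per sample (None = wild type)
    "S1": "EGFR L858R", "S2": "KRAS G12D", "S3": None,
    "S4": "BRAF V600E", "S5": None,
}
ngs_calls = {
    "S1": "EGFR L858R", "S2": "KRAS G12D", "S3": None,
    "S4": "BRAF V600E", "S5": "KRAS G12V",  # discordant extra call
}

# Count samples where both methods agree on the call.
concordant = sum(standard_calls[s] == ngs_calls[s] for s in standard_calls)
concordance = concordant / len(standard_calls)
print(f"concordance: {concordance:.0%}")
```

In a real validation, discordant calls such as the extra one above would be re-tested by an orthogonal method before drawing conclusions about platform reliability.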

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a framework for a telecommunications interface which allows data from sensors embedded in Smart Grid applications to be reliably archived in an appropriate time-series database. The challenge in doing so is two-fold: first, the various formats in which sensor data are represented; second, the problems of telecoms reliability. A prototype of the authors' framework is detailed which showcases its main features in a case study featuring Phasor Measurement Units (PMUs) as the application. Useful analysis of PMU data is achieved whenever data from multiple locations can be compared on a common time axis. The prototype developed highlights its reliability, extensibility and adoptability; features which are largely deferred by industry standards for data representation to proprietary database solutions. The open source framework presented provides link reliability for any type of Smart Grid sensor and is interoperable with both existing proprietary database systems and open database systems. The features of the authors' framework allow researchers and developers to focus on the core of their real-time or historical analysis applications, rather than having to spend time interfacing with complex protocols.
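The comparison of PMU data from multiple locations on a common time axis can be sketched as follows; the station names, timestamps, and frequency values are hypothetical, not taken from the prototype.

```python
from datetime import datetime, timedelta

# Hypothetical per-second frequency readings (Hz) from two substations,
# keyed by timestamp; in practice these come from the time-series store.
base = datetime(2024, 1, 1, 12, 0, 0)
pmu_a = {base + timedelta(seconds=s): 50.00 + 0.01 * s for s in range(5)}
pmu_b = {base + timedelta(seconds=s): 49.98 + 0.01 * s for s in range(1, 6)}

# Only timestamps present at both locations can be compared directly.
common = sorted(set(pmu_a) & set(pmu_b))
deltas = [pmu_a[t] - pmu_b[t] for t in common]
print(f"{len(common)} shared samples, mean deviation "
      f"{sum(deltas) / len(deltas):.3f} Hz")
```

Aligning on shared timestamps like this is the simplest policy; a production system would also need to handle clock skew and missing samples, which is part of the reliability problem the framework addresses.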

Relevance: 30.00%

Publisher:

Abstract:

Wireless communication technologies have become widely adopted, appearing in heterogeneous applications ranging from tracking victims, responders, and equipment in disaster scenarios to machine health monitoring in networked manufacturing systems. Very often, applications demand a strictly bounded timing response, which, in distributed systems, is generally highly dependent on the performance of the underlying communication technology. These systems are said to have real-time timeliness requirements, since data communication must be conducted within predefined temporal bounds, whose violation may compromise the correct behavior of the system and cause economic losses or endanger human lives. The potential adoption of wireless technologies for an increasingly broad range of application scenarios has made the operational requirements more complex and heterogeneous than they were for wired technologies. In parallel with this trend, there is an increasing demand for cost-effective distributed systems with improved deployment, maintenance, and adaptation features. These systems tend to require operational flexibility, which can only be ensured if the underlying communication technology provides both time- and event-triggered data transmission services while supporting on-line, on-the-fly parameter modification. Generally, wireless-enabled applications have deployment requirements that can only be addressed through the use of batteries and/or energy-harvesting mechanisms for power supply. These applications usually have stringent autonomy requirements and demand a small form factor, which hinders the use of large batteries. As the communication support may represent a significant part of the energy requirements of a station, the use of power-hungry technologies is not adequate. Hence, in such applications, low-range technologies have been widely adopted.
In fact, although low-range technologies provide smaller data rates, they spend just a fraction of the energy of their higher-power counterparts. The timeliness requirements of data communications can, in general, be met by ensuring the availability of the medium for any station initiating a transmission. In controlled (closed) environments this can be guaranteed, as there is strict regulation of which stations are installed in the area and for which purpose. Nevertheless, in open environments this is hard to control, because no a priori knowledge is available of which stations and technologies may contend for the medium at any given instant. Hence, the support of wireless real-time communications in unmanaged scenarios is a highly challenging task. Wireless low-power technologies have been the focus of a large research effort, for example in the Wireless Sensor Network domain. Although bringing extended autonomy to battery-powered stations, such technologies are known to be negatively influenced by similar technologies contending for the medium and, especially, by technologies using higher-power transmissions over the same frequency bands. A frequency band that is becoming increasingly crowded with competing technologies is the 2.4 GHz Industrial, Scientific and Medical band, encompassing, for example, Bluetooth and ZigBee, two low-power communication standards that are the basis of several real-time protocols. Although these technologies employ mechanisms to improve their coexistence, they are still vulnerable to transmissions from uncoordinated stations with similar technologies or from higher-power technologies such as Wi-Fi, which hinders the support of wireless dependable real-time communications in open environments.
The Wireless Flexible Time-Triggered Protocol (WFTT) is a master/multi-slave protocol that builds on the flexibility and timeliness provided by the FTT paradigm and on the deterministic medium capture and maintenance provided by the bandjacking technique. This dissertation presents the WFTT protocol and argues that it allows supporting wireless real-time communication services with high dependability requirements in open environments where multiple contention-based technologies may dispute medium access. Moreover, it claims that it is feasible to provide wireless communications in open environments that are at once flexible and timely. The WFTT protocol was inspired by the FTT paradigm, from which higher-layer services such as admission control have been ported. After realizing that bandjacking was an effective technique for ensuring medium access and maintenance in open environments crowded with contention-based communication technologies, it was recognized that the mechanism could be used to devise a wireless medium access protocol that could bring the features offered by the FTT paradigm to the wireless domain. The performance of the WFTT protocol is reported in this dissertation, with a description of the implemented devices and the test-bed, and a discussion of the obtained results.
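The FTT-style elementary-cycle scheduling that WFTT builds on can be sketched in a much simplified form. In the sketch below the master grants each slave one synchronous window per cycle and refuses windows that would overrun the cycle (a crude stand-in for admission control); the cycle length, window size, and slave names are assumptions for illustration, and the real protocol's admission control and bandjacking are far richer than this.

```python
# Simplified FTT-style elementary cycle: a trigger message from the
# master starts the cycle, then each admitted slave gets a window.
CYCLE_MS = 10    # elementary cycle length (hypothetical)
TRIGGER_MS = 1   # master trigger message at cycle start

def build_schedule(slaves, window_ms):
    """Return (slave, start_ms, end_ms) slots inside one elementary cycle."""
    schedule, t = [], TRIGGER_MS
    for slave in slaves:
        if t + window_ms > CYCLE_MS:
            break  # no admission: this slot would overrun the cycle
        schedule.append((slave, t, t + window_ms))
        t += window_ms
    return schedule

slots = build_schedule(["slave-1", "slave-2", "slave-3", "slave-4", "slave-5"], 2)
for slave, start, end in slots:
    print(f"{slave}: {start}-{end} ms")
```

Because the master alone decides the schedule each cycle, parameters can be changed on-the-fly between cycles, which is the operational flexibility the FTT paradigm provides.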

Relevance: 30.00%

Publisher:

Abstract:

This working paper summarizes key findings and demands from the (so far mostly English-language) discussion on releasing public data in a web-friendly form. The paper is intended as a starting point for discussion and strategy development, without being able to deliver the latter itself. The development potential of Open Government Data (OGD) is first presented from the perspectives of the various stakeholders. The term is then defined by means of the fundamental requirements for OGD formulated in the Sebastopol principles. Drawing on publications of the W3C, the paper shows the importance of using and (further) developing open standards for OGD, as well as the main problems of the corresponding change management in the public sector. Finally, some exemplary cases of the practical implementation of OGD are presented.

Relevance: 30.00%

Publisher:

Abstract:

The objective of this thesis is to demonstrate the importance of the concepts of rationality, reasonableness, culpability, and autonomy that inform and support our conception of both the person and the punishable subject. A critical discourse analysis tracing these concepts through both the law and the psychological tools used to evaluate a person's fitness reveals that these concepts and their implied values are inconsistently applied to the mentally disordered who come into conflict with the law. I argue that this inconsistency compromises a person's autonomy, contradicting that concept's role as a foundational principle of the law. Ultimately, this thesis does not provide a solution to be employed in policy making, but its analysis leaves open possibilities for further exploration into the ways legal and social justice can be reconciled.

Relevance: 30.00%

Publisher:

Abstract:

This lecture introduces an array of data sources that can be used to create new applications and visualisations, many examples of which are given. Additionally, there are a number of slides on open data standards, freedom of information requests and how to affect the future of open data.

Relevance: 30.00%

Publisher:

Abstract:

There is a lot of hype around the Internet of Things, along with talk of 100 billion devices within 10 years' time. The promise of innovative new services and efficiency savings is fueling interest in a wide range of potential applications across many sectors, including smart homes, healthcare, smart grids, smart cities, retail, and smart industry. However, the current reality is one of fragmentation and data silos. W3C is seeking to fix that by exposing IoT platforms through the Web, with shared semantics and data formats as the basis for interoperability. This talk will address the abstractions needed to move from a Web of pages to a Web of things, and introduce the work being done on standards and on open source projects for a new breed of Web servers, from microcontrollers to cloud-based server farms. Speaker Biography - Dave Raggett: Dave has been involved at the heart of web standards since 1992, and part of the W3C Team since 1995. As well as working on standards, he likes to dabble with software, and more recently with IoT hardware. He has participated in a wide range of European research projects on behalf of W3C/ERCIM. He currently focuses on Web payments, and on realising the potential of the Web of Things as an evolution from the Web of pages. Dave has a doctorate from the University of Oxford. He is a visiting professor at the University of the West of England, and lives in the UK in a small town near Bath.
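The shared semantics the talk describes are expressed in W3C Web of Things Thing Descriptions. The sketch below assembles a minimal, simplified TD-shaped record for a temperature sensor; the device endpoint and property names are assumed for illustration, and a conforming TD carries more detail than this.

```python
import json

# A hypothetical, much simplified W3C WoT-style Thing Description for a
# temperature sensor; fields follow the general TD shape only.
thing_description = {
    "@context": "https://www.w3.org/2019/wot/td/v1",
    "title": "TemperatureSensor",
    "properties": {
        "temperature": {
            "type": "number",
            "unit": "celsius",
            "forms": [{"href": "http://device.local/temp"}],  # hypothetical endpoint
        }
    },
}

# Serialized descriptions let clouds and microcontrollers interoperate
# without prior knowledge of each other's APIs.
print(json.dumps(thing_description, indent=2))
```

A client that understands the TD vocabulary can discover the `temperature` property and its endpoint without any device-specific code, which is the interoperability point the talk makes.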

Relevance: 30.00%

Publisher:

Abstract:

The process of buying, selling, or interacting with customers via the Internet, telesales, smart cards, or other computer networks is referred to as electronic commerce. While online trade has long touted its flexibility, convenience, and cost savings, the newest entrant is wireless e-commerce. This form of business offers many attractions, including a shop that is open 24 hours a day, seven days a week; vastly reduced fixed costs; and increased profitability. Amazon.com is an example of a successful e-business venture. Internet service providers (ISPs/ASPs) have a significant influence on the feasibility, security, and cost competitiveness of an e-business venture. In the ISP model of services, multiple users and their databases are normally hosted on a single hardware platform, sharing the same IP address and domain name. Clients require a mechanism that allows them to update their Web content and databases frequently, even many times daily, without the intervention of the local system administrator (ISP admin). The paper outlines steps to enable corporate clients to update their web content more securely.
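One way to let clients push updates without administrator intervention, in the spirit of the steps the paper outlines, is to authenticate each update with a shared secret so the ISP can verify its origin. The HMAC-based sketch below is an assumption for illustration, not the paper's actual mechanism; the key and payload are invented.

```python
import hashlib
import hmac

# Hypothetical shared secret issued by the ISP to one hosting client.
SHARED_KEY = b"client-42-secret"

def sign_update(payload: bytes, key: bytes = SHARED_KEY) -> str:
    """Client side: sign the content update before sending it."""
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_update(payload: bytes, signature: str,
                  key: bytes = SHARED_KEY) -> bool:
    """ISP side: accept the update only if the signature checks out."""
    return hmac.compare_digest(sign_update(payload, key), signature)

update = b"<html>new product page</html>"
sig = sign_update(update)
print("accepted" if verify_update(update, sig) else "rejected")
```

Because only the client and the ISP hold the key, a tampered or forged payload fails verification and can be rejected automatically, with no manual intervention by the ISP admin.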