971 results for Pitt, Christopher, 1699-1748

Relevance: 10.00%

The XML Document Mining track was launched to explore two main ideas: (1) identifying key problems and new challenges in the emerging field of mining semi-structured documents, and (2) studying and assessing the potential of Machine Learning (ML) techniques for generic ML tasks in the structured domain, i.e., classification and clustering of semi-structured documents. The track ran for six editions, from INEX 2005 through INEX 2010. The first five editions have been summarized in previous reports; here we focus on the 2010 edition. The XML Mining track at INEX 2010 included two tasks: (1) an unsupervised clustering task and (2) a semi-supervised classification task in which documents are organized in a graph. The clustering task requires participants to group the documents into clusters, without any knowledge of category labels, using an unsupervised learning algorithm. The classification task, in contrast, requires participants to label the documents with known categories using a supervised learning algorithm and a training set. This report gives the details of the clustering and classification tasks.
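A minimal sketch of the unsupervised clustering task described above, assuming toy documents and a plain term-frequency representation (the track's actual corpus and evaluation pipeline are not shown here):

```python
from collections import Counter
import math

def tf_vector(doc, vocab):
    """Term-frequency vector of a document over a fixed vocabulary."""
    counts = Counter(doc.split())
    return [counts[w] for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def kmeans_cosine(vecs, seeds, iters=10):
    """k-means with cosine similarity, seeded deterministically."""
    centroids = [list(vecs[i]) for i in seeds]
    assign = [0] * len(vecs)
    for _ in range(iters):
        assign = [max(range(len(centroids)),
                      key=lambda c: cosine(v, centroids[c])) for v in vecs]
        for c in range(len(centroids)):
            members = [v for v, a in zip(vecs, assign) if a == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

# Toy corpus: two documents about XML, two about vector quantization.
docs = ["xml schema element attribute xml",
        "xml document element schema",
        "cluster centroid distance vector",
        "vector quantization centroid cluster"]
vocab = sorted({w for d in docs for w in d.split()})
vecs = [tf_vector(d, vocab) for d in docs]
labels = kmeans_cosine(vecs, seeds=[0, 2])
```

With no category labels supplied, the algorithm still recovers the two topical groups from term overlap alone, which is the essence of the clustering task.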

Relevance: 10.00%

Smart matrices are required in bone tissue-engineered grafts that provide an optimal environment for cells and retain osteo-inductive factors for sustained biological activity. We hypothesized that a slow-degrading heparin-incorporated hyaluronan (HA) hydrogel can preserve BMP-2, while an arterio–venous (A–V) loop can support axial vascularization to provide nutrition for a bioartificial bone graft. HA was evaluated for osteoblast growth and BMP-2 release. Porous PLDLLA–TCP–PCL scaffolds were produced by rapid prototyping technology and applied in vivo along with HA-hydrogel loaded with either primary osteoblasts or BMP-2. A microsurgically created A–V loop was placed around the scaffold, encased in an isolation chamber, in Lewis rats. HA-hydrogel supported the growth of osteoblasts over 8 weeks and allowed sustained release of BMP-2 over 35 days. The A–V loop provided an angiogenic stimulus, with the formation of vascularized tissue in the scaffolds. Bone-specific genes were detected by real-time RT-PCR after 8 weeks. However, no significant amount of bone was observed histologically. The heterotopic isolation chamber, in combination with the absence of biomechanical stimulation, might explain the insufficient bone formation despite adequate expression of bone-related genes. Optimizing the interplay of osteogenic cells and osteo-inductive factors might eventually generate sufficient amounts of axially vascularized bone graft for reconstructive surgery.

Relevance: 10.00%

A Geant4-based simulation tool has been developed to perform Monte Carlo modelling of a 6 MV Varian™ iX clinac. The computer-aided design interface of Geant4 was used to accurately model the LINAC components, including the Millennium multi-leaf collimators (MLCs). The simulation tool was verified by simulating standard commissioning dosimetry data acquired with an ionisation chamber in a water phantom. Verification of the MLC model was achieved by simulating leaf-leakage measurements performed using Gafchromic™ film in a solid-water phantom. An absolute dose calibration capability was added by including a virtual monitor chamber in the simulation. Furthermore, a DICOM-RT interface was integrated with the application to allow the simulation of radiotherapy treatment plans. The ability of the simulation tool to accurately model leaf movements and doses at each control point was verified by simulating a widely used intensity-modulated radiation therapy (IMRT) quality assurance (QA) technique, the chair test.

Relevance: 10.00%

This paper describes the characterisation, for airborne use, of the public mobile data communication systems known broadly as 3G. The motivation for this study was to explore how these mature public communication systems could be used for aviation purposes. An experimental system was fitted to a light aircraft to record communication latency, line speed, RF level, packet loss and cell tower identifier. Communication was established using internet protocols, and a connection was made to a local server. The aircraft was flown in both remote and populous areas at altitudes up to 8,500 ft in a region of South East Queensland, Australia. Results show that average airborne RF levels, in the order of -77 dBm, are 21% better than those on the ground. Latencies were in the order of 500 ms (half the latency of Iridium), with an average download speed of 0.48 Mb/s, an average uplink speed of 0.85 Mb/s and a packet loss of 6.5%. The maximum communication range observed was 70 km from a single cell station. The paper also describes the possible limitations and utility of such a communications architecture for both manned and unmanned aircraft systems.
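The summary figures quoted above (average latency, RF level, packet loss) are the kind of statistics one would compute from per-sample flight logs; a minimal sketch with made-up sample values, not the study's data:

```python
# Hypothetical flight-log samples: (latency_ms, rf_dbm, packet_received).
# All values are illustrative, chosen only to match the scales in the paper.
samples = [(480, -75, True), (510, -79, True), (620, -81, False),
           (450, -74, True), (540, -77, True)]

# Latency is meaningful only for packets that arrived.
received = [s for s in samples if s[2]]
avg_latency = sum(s[0] for s in received) / len(received)

# RF level is logged for every sample, delivered or not.
avg_rf = sum(s[1] for s in samples) / len(samples)

# Packet loss as a percentage of all samples sent.
packet_loss = 100.0 * sum(1 for s in samples if not s[2]) / len(samples)
```

Averaging per-sample logs in this way is how a figure such as "-77 dBm average RF level" or "6.5% packet loss" would be derived from the raw airborne recordings.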

Relevance: 10.00%

Techniques for the accurate measurement of ionising radiation have been evolving since Roentgen first discovered X-rays in 1895; until now, experimental measurements of radiation fields in the three spatial dimensions plus time have not been successfully demonstrated. In this work, we embed an organic plastic scintillator in a polymer gel dosimeter to obtain the first quasi-4D experimental measurement of a radiation field.

Relevance: 10.00%

Traffic oscillations are typical features of congested traffic flow that are characterized by recurring decelerations followed by accelerations (stop-and-go driving). The negative environmental impacts of these oscillations are widely accepted, but their impact on traffic safety has been debated. This paper describes the impact of freeway traffic oscillations on traffic safety. This study employs a matched case-control design using high-resolution traffic and crash data from a freeway segment. Traffic conditions prior to each crash were taken as cases, while traffic conditions during the same periods on days without crashes were taken as controls. These were also matched by presence of congestion, geometry and weather. A total of 82 cases and about 80,000 candidate controls were extracted from more than three years of data from 2004 to 2007. Conditional logistic regression models were developed based on the case-control samples. To verify consistency in the results, 20 different sets of controls were randomly extracted from the candidate pool for varying control-case ratios. The results reveal that the standard deviation of speed (thus, oscillations) is a significant variable, with an average odds ratio of about 1.08. This implies that the likelihood of a (rear-end) crash increases by about 8% with an additional unit increase in the standard deviation of speed. The average traffic states prior to crashes were less significant than the speed variations in congestion.
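The 8% figure follows directly from the odds ratio: in a (conditional) logistic regression, the odds ratio for a one-unit increase in a covariate is exp(beta). A minimal sketch, with the coefficient chosen purely to reproduce the paper's average odds ratio of about 1.08:

```python
import math

def odds_ratio(beta):
    """Odds ratio for a one-unit increase in a logistic-regression covariate."""
    return math.exp(beta)

def percent_increase(or_value):
    """Percentage change in the odds implied by an odds ratio."""
    return 100.0 * (or_value - 1.0)

# Illustrative coefficient, back-solved from the reported odds ratio of ~1.08
# for the standard deviation of speed; not an estimate from the study's data.
beta_speed_sd = math.log(1.08)
or_speed = odds_ratio(beta_speed_sd)
```

So an odds ratio of 1.08 for the standard deviation of speed translates to roughly an 8% increase in (rear-end) crash odds per additional unit of speed variation, as stated above.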

Relevance: 10.00%

The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures, such as the recent increase in environmental awareness and its likely effects on regulation, are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software that streamlines business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers and third parties (services) in an attempt to cut costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring of organizational boundaries (Coad and Cullen, 2006).

The Information Technology (IT) revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale including logistics, production, and research and development became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting and Replenishment, owes much to the marketing efforts of the software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk, 2004).

Relevance: 10.00%

In this feasibility study, an organic plastic scintillator is calibrated against ionisation chamber measurements and then embedded in a polymer gel dosimeter to obtain a quasi-4D experimental measurement of a radiation field. This hybrid dosimeter was irradiated with a linear accelerator, with temporal measurements of the dose rate acquired by the scintillator and spatial measurements acquired with the gel dosimeter. The detectors employed in this work are radiologically equivalent, and we show that neither detector perturbs the intensity of the radiation field of the other. By employing these detectors in concert, spatial and temporal variations in radiation intensity can now be detected, and gel dosimeters can be calibrated for absolute dose from a single irradiation.

Relevance: 10.00%

The structure and dynamics of a modern business environment are very hard to model using traditional methods. Such complexity raises challenges for effective business analysis and improvement. The importance of applying business process simulation to analyze and improve business activities has been widely recognized. However, one remaining challenge is the development of approaches to simulating human resource behavior. To address this problem, we describe a novel simulation approach in which intelligent agents simulate human resources by performing work allocated from a workflow management system. The behavior of the intelligent agents is driven by a state transition mechanism called a Hierarchical Task Network (HTN). We demonstrate and validate our simulator via a medical treatment process case study. Analysis of the simulation results shows that the behavior driven by the HTN is consistent with the design of the workflow model. We believe these preliminary results support the development of more sophisticated agent-based human resource simulation systems.
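An HTN drives agent behavior by recursively decomposing compound tasks into primitive, executable work items. A deliberately small illustrative sketch follows; the paper's actual agent framework and the medical task names are assumptions, not its implementation:

```python
# Hypothetical HTN methods: each compound task maps to an ordered list of
# subtasks; any task without a method is treated as primitive (executable).
methods = {
    "treat_patient": ["register", "diagnose", "apply_therapy"],
    "apply_therapy": ["prescribe", "administer"],
}

def decompose(task, plan):
    """Depth-first HTN decomposition into a flat, ordered plan."""
    if task in methods:            # compound task: expand via its method
        for sub in methods[task]:
            decompose(sub, plan)
    else:                          # primitive task: an executable work item
        plan.append(task)
    return plan

plan = decompose("treat_patient", [])
```

An agent executing `plan` performs the primitive work items in the order the network prescribes, which is how HTN-driven behavior can be checked for consistency against a workflow model.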

Relevance: 10.00%

In this paper we present pyktree, an implementation of the K-tree algorithm in the Python programming language. The K-tree algorithm provides highly balanced search trees for vector quantization that scale up to very large data sets. Pyktree is highly modular and well suited to rapid prototyping of novel distance measures and centroid representations. It is easy to install and provides a Python package for library use as well as command-line tools.
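Pyktree's own API is not reproduced here, but the underlying K-tree idea, a tree of cluster centroids searched top-down to quantize a vector, can be sketched as follows (toy two-level tree with made-up centroids, not pyktree code):

```python
import math

def nearest(vec, centroids):
    """Index of the centroid closest to vec (Euclidean distance)."""
    return min(range(len(centroids)),
               key=lambda i: math.dist(vec, centroids[i]))

# Toy two-level centroid tree: the root holds top-level centroids, and each
# top-level index maps to its child centroids (the leaf codebook).
tree = {
    "root": [(0.0, 0.0), (10.0, 10.0)],
    0: [(-1.0, 0.0), (1.0, 1.0)],
    1: [(9.0, 9.0), (11.0, 12.0)],
}

def quantize(vec):
    """Descend the tree: pick the best top-level centroid, then its best child."""
    top = nearest(vec, tree["root"])
    leaf = nearest(vec, tree[top])
    return top, leaf

code = quantize((9.5, 9.0))
```

Because each level only compares against a small set of centroids, lookup cost grows with tree depth rather than with the total number of leaf centroids, which is what lets K-tree-style structures scale to very large data sets.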

Relevance: 10.00%

We consider how to design dose-finding studies efficiently and safely. Both current and novel utility functions are explored using Bayesian adaptive design methodology for the estimation of a maximum tolerated dose (MTD). In particular, we explore widely adopted approaches such as the continual reassessment method and minimizing the variance of the estimate of the MTD. New utility functions are constructed in the Bayesian framework and are evaluated against current approaches. To reduce computing time, importance sampling is implemented to re-weight posterior samples, thus avoiding the need to draw samples using Markov chain Monte Carlo techniques. Further, as such studies are generally first-in-man, the safety of patients is paramount. We therefore explore methods for incorporating safety considerations into utility functions to ensure that only safe and well-predicted doses are administered. This amalgamation of Bayesian methodology, adaptive design and compound utility functions is termed adaptive Bayesian compound design (ABCD). Its performance is investigated via simulation of dose-finding studies. The paper concludes with a discussion of the results and of extensions that could be included in our approach.
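The importance-sampling step described above can be sketched generically: samples drawn from a tractable distribution are re-weighted by the likelihood so posterior quantities can be approximated without running MCMC. A toy binomial example follows; the numbers and the uniform proposal are illustrative assumptions, not the paper's dose-response model:

```python
import math
import random

random.seed(0)

# Toy data: y successes (e.g. toxicities) observed in n trials.
y, n = 3, 10

def log_likelihood(p):
    """Binomial log-likelihood of success probability p (constant dropped)."""
    return y * math.log(p) + (n - y) * math.log(1.0 - p)

# Draw from a Uniform(0, 1) proposal (also the prior here), then re-weight
# each sample by its likelihood instead of sampling the posterior directly.
samples = [random.uniform(1e-9, 1.0 - 1e-9) for _ in range(20000)]
weights = [math.exp(log_likelihood(p)) for p in samples]
total = sum(weights)
posterior_mean = sum(p * w for p, w in zip(samples, weights)) / total
```

With a uniform prior the exact posterior is Beta(4, 8), whose mean is 1/3; the weighted average recovers this without any Markov chain, which is the computational saving the abstract refers to.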

Relevance: 10.00%

The ability to play freely in our cities is essential for sustainable wellbeing. When integrated successfully into our cities, Urban Play performs an important role, contributing physically, socially and culturally to the image of the city. While Urban Play is essential, it also finds itself in conflict with the city. Under modernist urban approaches, play activities have become progressively segregated from the urban context through a tripartite of design, procurement and management practices. Despite these restrictions, emergent underground play forms overcome the isolation of play within urban space. One of these activities, parkour, is used as an evocative case study to reveal hidden urban terrains of desire and fear as it re-interprets the fabric of the city, eliciting practice-based discussions about procurement, design and management practice along its route.

Relevance: 10.00%

My aim in this paper is to challenge the increasingly common view in the literature that the law on end of life decision making is in disarray and is in need of urgent reform. My argument is that this assessment of the law is based on assumptions about the relationship between the identity of the defendant and their conduct, and about the nature of causation, which, on examination, prove to be indefensible. I then provide a clarification of the relationship between causation and omissions which proves that the current legal position does not need modification, at least on the grounds that are commonly advanced for the converse view. This enables me, in conclusion, to clarify important conceptual and moral differences between withholding, refusing and withdrawing life-sustaining measures on the one hand, and assisted suicide and euthanasia, on the other.

Relevance: 10.00%

The proportion of functional sequence in the human genome is currently a subject of debate. The most widely accepted figure is that approximately 5% is under purifying selection. In Drosophila, estimates are an order of magnitude higher, though this corresponds to a similar quantity of sequence. These estimates depend on the difference between the distribution of genome-wide evolutionary rates and that observed in a subset of sequences presumed to be neutrally evolving. Motivated by the widening gap between these estimates and experimental evidence of genome function, especially in mammals, we developed a sensitive technique for evaluating such distributions and found that they are much more complex than previously apparent. We found strong evidence for at least nine well-resolved evolutionary rate classes in an alignment of four Drosophila species and at least seven classes in an alignment of four mammals, including human. We also identified at least three rate classes in human ancestral repeats. By positing that the largest of these ancestral repeat classes is neutrally evolving, we estimate that the proportion of non-neutrally evolving sequence is 30% of human ancestral repeats and 45% of the aligned portion of the genome. However, we also question whether any of the classes represent neutrally evolving sequences and argue that a plausible alternative is that they reflect variable structure-function constraints operating throughout the genomes of complex organisms.
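The notion of evolutionary rate classes can be illustrated with a deliberately simplified sketch: aligned segments are assigned to whichever rate class makes their observed substitution count most likely. The two-class Poisson model and all numbers below are assumptions for illustration, not the paper's method or data:

```python
import math

# Hypothetical rate classes: expected substitutions per segment under a
# slowly evolving (constrained) and a fast (more neutral-like) class.
rates = {"slow": 1.0, "fast": 4.0}

def poisson_logpmf(k, lam):
    """log P(K = k) for a Poisson(lam) substitution-count model."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def classify(count):
    """Assign a segment to the rate class with the higher likelihood."""
    return max(rates, key=lambda c: poisson_logpmf(count, rates[c]))

# Segments with few substitutions fall in the slow class, heavily
# substituted segments in the fast class.
labels = [classify(k) for k in (0, 1, 5, 7)]
```

Real analyses fit many such classes to genome-wide alignments and then ask which, if any, class behaves like neutrally evolving sequence; the sketch only shows the classification principle.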