602 results for Large property
Abstract:
Every day we hear someone complain that this or that patent should not have been granted. People complain that the patent system is now a threat to existing business and innovation because the patent office grants with alarming regularity patents for inventions that are neither novel nor non-obvious. People argue that the patent office cannot keep up with the job of examining the backlog of hundreds of thousands of patents and that, even if it could, the large volumes of prior art literature that need to be considered each time a patent application is received make the decision as to whether a patent should be granted or not a treacherous one.
Abstract:
A software tool (DRONE) has been developed to evaluate road traffic noise in a large area, taking into account dynamic network traffic flow and buildings. For more precise estimation of noise in urban networks, where vehicles mainly run in stop-and-go conditions, vehicle sound power levels (for accelerating/decelerating, cruising and idling vehicles) are incorporated in DRONE. The calculation performance of DRONE is improved by evaluating the noise in two steps: first estimating a unit noise database and then integrating it with traffic simulation. Details of the process from traffic simulation to contour maps are discussed in the paper, and the implementation of DRONE for Tsukuba city is presented.
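The integration step described above amounts to combining per-vehicle sound power levels (looked up from a unit noise database keyed by running condition) into a total level. A minimal sketch, using standard energetic summation of decibel levels; the running conditions and level values here are hypothetical, not DRONE's actual database:

```python
import math

def combine_levels(levels_db):
    """Energetically sum sound levels in dB: L = 10*log10(sum 10^(Li/10))."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

# Hypothetical unit-noise database: sound power level (dB) per running condition
unit_noise_db = {"accelerating": 75.0, "decelerating": 70.0,
                 "cruising": 72.0, "idling": 60.0}

def segment_level(vehicle_states):
    """Total level for one road segment, given each simulated vehicle's state."""
    return combine_levels([unit_noise_db[s] for s in vehicle_states])
```

Two equal sources add roughly 3 dB, which is why a lookup-then-sum design can be much cheaper than re-running propagation for every vehicle.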
Abstract:
Conifers are resistant to attack from a large number of potential herbivores or pathogens. Previous molecular and biochemical characterization of selected conifer defence systems support a model of multigenic, constitutive and induced defences that act on invading insects via physical, chemical, biochemical or ecological (multitrophic) mechanisms. However, the genomic foundation of the complex defence and resistance mechanisms of conifers is largely unknown. As part of a genomics strategy to characterize inducible defences and possible resistance mechanisms of conifers against insect herbivory, we developed a cDNA microarray building upon a new spruce (Picea spp.) expressed sequence tag resource. This first-generation spruce cDNA microarray contains 9720 cDNA elements representing c. 5500 unique genes. We used this array to monitor gene expression in Sitka spruce (Picea sitchensis) bark in response to herbivory by white pine weevils (Pissodes strobi, Curculionidae) or wounding, and in young shoot tips in response to western spruce budworm (Choristoneura occidentalis, Lepidopterae) feeding. Weevils are stem-boring insects that feed on phloem, while budworms are foliage feeding larvae that consume needles and young shoot tips. Both insect species and wounding treatment caused substantial changes of the host plant transcriptome detected in each case by differential gene expression of several thousand array elements at 1 or 2 d after the onset of treatment. Overall, there was considerable overlap among differentially expressed gene sets from these three stress treatments. Functional classification of the induced transcripts revealed genes with roles in general plant defence, octadecanoid and ethylene signalling, transport, secondary metabolism, and transcriptional regulation. 
Several genes involved in primary metabolic processes such as photosynthesis were down-regulated upon insect feeding or wounding, fitting with the concept of dynamic resource allocation in plant defence. Refined expression analysis using gene-specific primers and real-time PCR for selected transcripts was in agreement with microarray results for most genes tested. This study provides the first large-scale survey of insect-induced defence transcripts in a gymnosperm and provides a platform for functional investigation of plant-insect interactions in spruce. Induction of spruce genes of octadecanoid and ethylene signalling, terpenoid biosynthesis, and phenolic secondary metabolism is discussed in more detail.
Abstract:
Objectives: This methodological paper reports on the development and validation of a work sampling instrument and data collection processes to conduct a national study of nurse practitioners’ work patterns. ---------- Design: Published work sampling instruments provided the basis for development and validation of a tool for use in a national study of nurse practitioner work activities across diverse contextual and clinical service models. Steps taken in the approach included design of a nurse practitioner-specific data collection tool and development of an innovative web-based program to train and establish inter-rater reliability among a team of data collectors who were geographically dispersed across metropolitan, rural and remote health care settings. ---------- Setting: The study is part of a large funded study into nurse practitioner service. The Australian Nurse Practitioner Study is a national study phased over three years and was designed to provide essential information for Australian health service planners, regulators and consumer groups on the profile, process and outcome of nurse practitioner service. ---------- Results: The outcome of this phase of the study is empirically tested instruments, processes and training materials for use in an international context by investigators interested in conducting a national study of nurse practitioner work practices. ---------- Conclusion: Development and preparation of a new approach to describing nurse practitioner practices using work sampling methods provides the groundwork for international collaboration in the evaluation of nurse practitioner service.
Abstract:
Purpose/Design/methodology/approach: The acknowledgement of state significance in relation to development projects can result in special treatment by regulatory authorities, particularly in terms of environmental compliance and certain economic and other government support measures. However, defining just what constitutes a “significant project”, or a project of “state significance”, varies considerably between Australian states. In terms of establishing threshold levels, in Queensland there is even less clarity. Despite this lack of definition, the implications of “state significance” can nevertheless be considerable. For example, in Queensland if the Coordinator-General declares a project to be a “significant project” under the State Development and Public Works Organisation Act 1971, the environmental impact assessment process may become more streamlined – potentially circumventing certain provisions under the Integrated Planning Act 1997. If the project is not large enough to be so deemed, an extractive resource under the State Planning Policy 2/07 - Protection of Extractive Resources 2007 may be considered to be of State or regional significance and subsequently designated as a “Key Resource Area”. As a consequence, such a project is afforded some measure of resource protection but remains subject to the normal assessment process under the Integrated Development Assessment System, as well as the usual requirements of the vegetation management codes and other regulations. Findings (Originality/value) & Research limitations/implications: This paper explores the various meanings of “state significance” in Queensland and the ramifications for development projects in that state. It argues for a streamlining of the assessment process in order to avoid or minimise constraints acting on the state’s development. In so doing, it questions the existence of a strategic threat to the delivery of an already over-stretched infrastructure program.
Abstract:
Using Graduate Careers Australia’s Course Experience Questionnaire (CEQ), students’ perceptions of the quality of property education in Australia are assessed over 1994-2009. Analyses are presented for the major property universities in Australia regarding good teaching and overall satisfaction, as well as for the property discipline benchmarked against the property-related disciplines of accounting, building, business, economics, law and planning. The link between good teaching and overall satisfaction, and the delivery of added value by property programs, are also assessed. Changes over this 16-year period are highlighted in terms of student perceptions of the quality of property education in Australia.
Abstract:
The treatment of challenging fractures and large osseous defects presents a formidable problem for orthopaedic surgeons. Tissue engineering/regenerative medicine approaches seek to solve this problem by delivering osteogenic signals within scaffolding biomaterials. In this study, we introduce a hybrid growth factor delivery system that consists of an electrospun nanofiber mesh tube for guiding bone regeneration combined with peptide-modified alginate hydrogel injected inside the tube for sustained growth factor release. We tested the ability of this system to deliver recombinant human bone morphogenetic protein-2 (rhBMP-2) for the repair of critically-sized segmental bone defects in a rat model. Longitudinal μ-CT analysis and torsional testing provided quantitative assessment of bone regeneration. Our results indicate that the hybrid delivery system resulted in consistent bony bridging of the challenging bone defects. However, in the absence of rhBMP-2, the use of nanofiber mesh tube and alginate did not result in substantial bone formation. Perforations in the nanofiber mesh accelerated the rhBMP-2 mediated bone repair, and resulted in functional restoration of the regenerated bone. μ-CT based angiography indicated that perforations did not significantly affect the revascularization of defects, suggesting that some other interaction with the tissue surrounding the defect, such as improved infiltration of osteoprogenitor cells, contributed to the observed differences in repair. Overall, our results indicate that the hybrid alginate/nanofiber mesh system is a promising growth factor delivery strategy for the repair of challenging bone injuries.
Abstract:
Safety interventions (e.g., median barriers, photo enforcement) and road features (e.g., median type and width) can influence crash severity, crash frequency, or both. Both dimensions—crash frequency and crash severity—are needed to obtain a full accounting of road safety. Extensive literature and common sense both dictate that crashes are not created equal, with fatalities costing society more than 1,000 times the cost of property damage crashes on average. Despite this glaring disparity, the profession has not unanimously embraced or successfully defended a nonarbitrary severity weighting approach for analyzing safety data and conducting safety analyses. It is argued here that the two dimensions (frequency and severity) are made available by intelligently and reliably weighting crash frequencies and converting all crashes to property-damage-only crash equivalents (PDOEs) by using comprehensive societal unit crash costs. This approach is analogous to calculating axle load equivalents in the prediction of pavement damage: for instance, a 40,000-lb truck causes 4,025 times more stress than does a 4,000-lb car and so simply counting axles is not sufficient. Calculating PDOEs using unit crash costs is the most defensible and nonarbitrary weighting scheme, allows for the simple incorporation of severity and frequency, and leads to crash models that are sensitive to factors that affect crash severity. Moreover, using PDOEs diminishes the errors introduced by underreporting of less severe crashes—an added benefit of the PDOE analysis approach. The method is illustrated with rural road segment data from South Korea (which in practice would develop PDOEs with Korean crash cost data).
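The weighting scheme described above reduces to scaling each severity class's crash count by the ratio of its comprehensive unit crash cost to the property-damage-only (PDO) unit cost. A minimal sketch with hypothetical unit costs (an actual study, as the paper notes for South Korea, would substitute jurisdiction-specific crash cost data):

```python
# Hypothetical comprehensive unit crash costs per severity class (currency units)
unit_costs = {"fatal": 4_000_000, "serious": 200_000, "minor": 40_000, "pdo": 4_000}

def pdo_equivalents(crash_counts):
    """Convert a frequency-by-severity crash record into PDO equivalents (PDOEs)."""
    pdo_cost = unit_costs["pdo"]
    return sum(n * unit_costs[sev] / pdo_cost for sev, n in crash_counts.items())
```

With these illustrative costs one fatal crash weighs as much as 1,000 PDO crashes, so a segment's PDOE total reflects severity as well as frequency in a single response variable.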
Abstract:
Routing trains within passenger stations in major cities is a common scheduling problem for railway operation. Various studies have been undertaken to derive and formulate solutions to this route allocation problem (RAP) which is particularly evident in mainland China nowadays because of the growing traffic demand and limited station capacity. A reasonable solution must be selected from a set of available RAP solutions attained in the planning stage to facilitate station operation. The selection is however based on the experience of the operators only and objective evaluation of the solutions is rarely addressed. In order to maximise the utilisation of station capacity while maintaining service quality and allowing for service disturbance, quantitative evaluation of RAP solutions is highly desirable. In this study, quantitative evaluation of RAP solutions is proposed and it is enabled by a set of indices covering infrastructure utilisation, buffer times and delay propagation. The proposed evaluation is carried out on a number of RAP solutions at a real-life busy railway station in mainland China and the results highlight the effectiveness of the indices in pinpointing the strengths and weaknesses of the solutions. This study provides the necessary platform to improve the RAP solution in planning and to allow train re-routing upon service disturbances.
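Two of the index families mentioned above (infrastructure utilisation and buffer times) can be computed directly from the occupation intervals that a RAP solution assigns to each track or platform. A minimal sketch under that assumption; the interval representation and index definitions here are illustrative, not the paper's exact formulations:

```python
def utilisation_rate(occupations, horizon):
    """Fraction of the planning horizon a platform/track is occupied.

    occupations: list of (start, end) time pairs; horizon: total planning time.
    """
    return sum(end - start for start, end in occupations) / horizon

def min_buffer(occupations):
    """Smallest gap between consecutive occupations of the same resource.

    A small minimum buffer signals vulnerability to delay propagation.
    """
    s = sorted(occupations)
    return min(b[0] - a[1] for a, b in zip(s, s[1:]))
```

Comparing these indices across candidate RAP solutions makes the capacity/robustness trade-off explicit: higher utilisation usually comes at the cost of thinner buffers.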
Abstract:
Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business process model repositories. For example, in some cases new process models may be derived from existing models, thus finding these models and adapting them may be more effective and less error-prone than developing them from scratch. Since process model repositories may be large, query evaluation may be time consuming. Hence, we investigate the use of indexes to speed up this evaluation process. To make our approach more applicable, we consider the semantic similarity between labels. Experiments are conducted to demonstrate that our approach is efficient.
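The combination described above (an index for fast lookup plus semantic similarity between labels) can be sketched as an inverted index whose keys are canonicalised task labels. The synonym map below is a hypothetical stand-in for a real label-similarity measure, and the class is illustrative rather than the paper's actual index structure:

```python
from collections import defaultdict

# Hypothetical synonym map standing in for a semantic label-similarity measure
synonyms = {"check invoice": "verify invoice", "settle payment": "pay"}

def canonical(label):
    """Map a task label to its canonical form before indexing or querying."""
    return synonyms.get(label, label)

class LabelIndex:
    """Inverted index: canonical task label -> ids of process models containing it."""
    def __init__(self):
        self.index = defaultdict(set)

    def add(self, model_id, labels):
        for label in labels:
            self.index[canonical(label)].add(model_id)

    def query(self, label):
        return self.index[canonical(label)]
```

Because both indexing and querying pass through `canonical`, a query for "check invoice" also retrieves models whose tasks are labelled "verify invoice", without scanning the whole repository.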
Abstract:
The Sascha-Pelligrini low-sulphidation epithermal system is located on the western edge of the Deseado Massif, Santa Cruz Province, Argentina. Outcrop sampling has returned values of up to 160 g/t gold and 796 g/t silver, with Mirasol Resources and Coeur d'Alene Mines currently exploring the property. Detailed mapping of the volcanic stratigraphy has defined three units that comprise the middle Jurassic Chon Aike Formation and two units that comprise the upper Jurassic La Matilde Formation. The Chon Aike Formation consists of rhyodacite ignimbrites and tuffs, with the La Matilde Formation including rhyolite ash and lithic tuffs. The volcanic sequence is intruded by a large flow-banded rhyolite dome, with small, spatially restricted granodiorite dykes and sills cropping out across the study area. ASTER multispectral mineral mapping, combined with PIMA (Portable Infrared Mineral Analyser) and XRD (X-ray diffraction) analysis, defines an alteration pattern that zones from laumontite-montmorillonite, to illite-pyrite-chlorite, followed by a quartz-illite-smectite-pyrite-adularia vein selvage. Supergene kaolinite and steam-heated acid-sulphate kaolinite-alunite-opal alteration horizons crop out along the Sascha Vein trend and Pelligrini respectively. Paragenetically, epithermal veining varies from chalcedonic to saccharoidal with minor bladed textures, colloform/crustiform-banded with visible electrum and acanthite, crustiform-banded grey chalcedonic to jasperoidal with fine pyrite, and crystalline comb quartz. Geothermometry of mineralised veins constrains formation temperatures from 174.8 to 205.1°C and correlates with the stability field for the interstratified illite-smectite vein selvage. Vein morphology, mineralogy and associated alteration are controlled by host rock rheology, permeability, and depth of the palaeo-water table.
Mineralisation within ginguro banded veins resulted from fluctuating fluid pH associated with selenide-rich magmatic pulses, pressure release boiling and wall-rock silicate buffering. The study of the Sascha-Pelligrini epithermal system will form the basis for a deposit-specific model helping to clarify the current understanding of epithermal deposits, and may serve as a template for exploration of similar epithermal deposits throughout Santa Cruz.
Abstract:
This paper investigates the current turbulent state of copyright in the digital age, and explores the viability of alternative compensation systems that aim to achieve the same goals with fewer negative consequences for consumers and artists. To sustain existing business models associated with creative content, increased recourse to DRM (Digital Rights Management) technologies, designed to restrict access to and usage of digital content, is well underway. Considerable technical challenges associated with DRM systems necessitate increasingly aggressive recourse to the law. A number of controversial aspects of copyright enforcement are discussed and contrasted with those inherent in levy based compensation systems. Lateral exploration of the copyright dilemma may help prevent some undesirable societal impacts, but with powerful coalitions of creative, consumer electronics and information technology industries having enormous vested interest in current models, alternative schemes are frequently treated dismissively. This paper focuses on consideration of alternative models that better suit the digital era whilst achieving a more even balance in the copyright bargain.
Abstract:
The period from 2007 to 2009 covered both the residential property boom dating from the early 2000s and the property recession following the Global Financial Crisis (GFC). Since late 2008, a number of residential property markets have suffered significant falls in house prices, but this has not been consistent across all market sectors. This paper will analyse the housing market in Brisbane, Australia to determine the impact, similarities and differences that the GFC had on a range of residential sectors across a diversified property market. Data analysis will provide an overview of residential property prices, sales and listing volumes over the study period and will provide a comparison of median house price performance across the geographic and socio-economic areas of Brisbane.
Abstract:
With regard to the long-standing problem of the semantic gap between low-level image features and high-level human knowledge, the image retrieval community has recently shifted its emphasis from low-level feature analysis to high-level image semantics extraction. User studies reveal that users tend to seek information using high-level semantics. Therefore, image semantics extraction is of great importance to content-based image retrieval because it allows the users to freely express what images they want. Semantic content annotation is the basis for semantic content retrieval. The aim of image annotation is to automatically obtain keywords that can be used to represent the content of images. The major research challenges in image semantic annotation are: What is the basic unit of semantic representation? How can the semantic unit be linked to high-level image knowledge? How can contextual information be stored and utilized for image annotation? In this thesis, Semantic Web technology (i.e. ontology) is introduced to the image semantic annotation problem. The Semantic Web, the next generation web, aims at making the content of whatever type of media understandable not only to humans but also to machines. Due to the large amounts of multimedia data prevalent on the Web, researchers and industries are beginning to pay more attention to the Multimedia Semantic Web. The Semantic Web technology provides a new opportunity for multimedia-based applications, but research in this area is still in its infancy. Whether ontology can be used to improve image annotation, and how best to use ontology in semantic representation and extraction, remain worthwhile investigations. This thesis deals with the problem of image semantic annotation using ontology and machine learning techniques in four phases, as below. 1) Salient object extraction.
A salient object serves as the basic unit in image semantic extraction as it captures the common visual property of the objects. Image segmentation is often used as the first step for detecting salient objects, but most segmentation algorithms fail to generate meaningful regions due to over-segmentation and under-segmentation. We develop a new salient object detection algorithm by combining multiple homogeneity criteria in a region merging framework. 2) Ontology construction. Since real-world objects tend to exist in a context within their environment, contextual information has been increasingly used for improving object recognition. In the ontology construction phase, visual-contextual ontologies are built from a large set of fully segmented and annotated images. The ontologies are composed of several types of concepts (i.e. mid-level and high-level concepts) and domain contextual knowledge. The visual-contextual ontologies stand as a user-friendly interface between low-level features and high-level concepts. 3) Image object annotation. In this phase, each object is labelled with a mid-level concept in the ontologies. First, a set of candidate labels is obtained by training Support Vector Machines with features extracted from salient objects. After that, contextual knowledge contained in the ontologies is used to obtain the final labels by removing ambiguous concepts. 4) Scene semantic annotation. The scene semantic extraction phase determines the scene type by using both mid-level concepts and domain contextual knowledge in the ontologies. Domain contextual knowledge is used to create a scene configuration that describes which objects co-exist with which scene type more frequently. The scene configuration is represented in a probabilistic graph model, and probabilistic inference is employed to calculate the scene type given an annotated image.
To evaluate the proposed methods, a series of experiments have been conducted in a large set of fully annotated outdoor scene images. These include a subset of the Corel database, a subset of the LabelMe dataset, the evaluation dataset of localized semantics in images, the spatial context evaluation dataset, and the segmented and annotated IAPR TC-12 benchmark.
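The scene-inference step of the pipeline above (objects annotated, scene type inferred from object/scene co-occurrence) can be sketched with a simple smoothed log-likelihood over co-occurrence counts. This is an illustrative stand-in for the thesis's probabilistic graph model, and the scenes, objects and counts are hypothetical:

```python
import math
from collections import Counter

# Hypothetical scene -> object co-occurrence counts learned from annotated images
cooccurrence = {
    "beach":  Counter({"sand": 40, "water": 35, "sky": 30, "tree": 5}),
    "forest": Counter({"tree": 45, "sky": 25, "grass": 20, "water": 5}),
}

def infer_scene(objects):
    """Pick the scene maximising an add-one-smoothed log-likelihood of the objects."""
    best, best_score = None, -math.inf
    for scene, counts in cooccurrence.items():
        total = sum(counts.values())
        # Laplace smoothing keeps unseen objects from zeroing out a scene
        score = sum(math.log((counts[o] + 1) / (total + 1)) for o in objects)
        if score > best_score:
            best, best_score = scene, score
    return best
```

A full graph model would also exploit object-object dependencies, but even this naive version shows how contextual knowledge disambiguates scene type from an annotated object set.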