70 results for Infancy

in Queensland University of Technology - ePrints Archive


Relevance: 20.00%

Abstract:

Background: Advanced paternal age (APA) is associated with an increased risk of neurodevelopmental disorders such as autism and schizophrenia, as well as with dyslexia and reduced intelligence. The aim of this study was to examine the relationship between paternal age and performance on neurocognitive measures during infancy and childhood. Methods and Findings: A sample of singleton children (n = 33,437) was drawn from the US Collaborative Perinatal Project. The outcome measures were assessed at 8 mo, 4 y, and 7 y (Bayley scales, Stanford Binet Intelligence Scale, Graham-Ernhart Block Sort Test, Wechsler Intelligence Scale for Children, Wide Range Achievement Test). The main analyses examined the relationship between neurocognitive measures and paternal or maternal age when adjusted for potential confounding factors. Advanced paternal age showed significant associations with poorer scores on all of the neurocognitive measures apart from the Bayley Motor score. The findings were broadly consistent in direction and effect size at all three ages. In contrast, advanced maternal age was generally associated with better scores on these same measures. Conclusions: The offspring of older fathers show subtle impairments on tests of neurocognitive ability during infancy and childhood. In light of secular trends related to delayed fatherhood, the clinical implications and the mechanisms underlying these findings warrant closer scrutiny.
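The core of the main analysis is an adjusted regression of a cognitive score on parental age. A minimal sketch of that idea, on synthetic data (the actual study adjusted for many more confounders, and the variable names and coefficients here are invented for illustration):

```python
import numpy as np

# Synthetic data: a cognitive score with a small negative paternal-age
# effect and a small positive maternal-age effect, plus noise.
rng = np.random.default_rng(42)
n = 5000
paternal_age = rng.normal(30.0, 6.0, n)
maternal_age = 0.7 * paternal_age + rng.normal(9.0, 3.0, n)  # correlated, as in real couples
score = 100.0 - 0.1 * paternal_age + 0.05 * maternal_age + rng.normal(0.0, 5.0, n)

# Design matrix: intercept, paternal age, maternal age.
X = np.column_stack([np.ones(n), paternal_age, maternal_age])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

# beta[1] is the paternal-age coefficient adjusted for maternal age;
# with this sample size it recovers the negative effect built into the data.
print(beta[1])
```

The point of the adjustment is visible here: because paternal and maternal age are correlated, an unadjusted regression on paternal age alone would conflate the two opposite-signed effects.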

Relevance: 20.00%

Abstract:

Background: Rapid weight gain in infancy is an important predictor of obesity in later childhood. Our aim was to determine which modifiable variables are associated with rapid weight gain in early life. Methods: Subjects were healthy infants enrolled in NOURISH, a randomised, controlled trial evaluating an intervention to promote positive early feeding practices. This analysis used the birth and baseline data for NOURISH. Birthweight was collected from hospital records, and infants were also weighed at the baseline assessment, when they were aged 4-7 months, before randomisation. Infant feeding practices and demographic variables were collected from the mother using a self-administered questionnaire. Rapid weight gain was defined as an increase in weight-for-age Z-score (using WHO standards) above 0.67 SD from birth to baseline assessment, which is interpreted clinically as crossing centile lines on a growth chart. Variables associated with rapid weight gain were evaluated using a multivariable logistic regression model. Results: Complete data were available for 612 infants (88% of the total sample recruited) with a mean (SD) age of 4.3 (1.0) months at baseline assessment. After adjusting for mother's age, smoking in pregnancy, BMI and education, and infant birthweight, age, gender and introduction of solid foods, the only two modifiable factors significantly associated with rapid weight gain were formula feeding [OR=1.72 (95% CI 1.01-2.94), P=0.047] and feeding on schedule [OR=2.29 (95% CI 1.14-4.61), P=0.020]. Male gender and lower birthweight were non-modifiable factors associated with rapid weight gain. Conclusions: This analysis supports the contention that there is an association between formula feeding, feeding to schedule and weight gain in the first months of life. Mechanisms may include the actual content of formula milk (e.g. higher protein intake) or differences in feeding styles, such as feeding to schedule, which increase the risk of overfeeding. Trial Registration: Australian Clinical Trials Registry ACTRN12608000056392
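The paper's rapid-weight-gain definition is easy to make concrete. A minimal sketch, where the 0.67 SD threshold comes from the abstract but the function name and example z-scores are illustrative:

```python
# Rapid weight gain per the study definition: a change in weight-for-age
# z-score (WHO standards) of more than 0.67 SD between birth and the
# baseline assessment, roughly one centile band on a growth chart.
def rapid_weight_gain(z_birth, z_followup, threshold=0.67):
    """Return True if the z-score gain exceeds the threshold."""
    return (z_followup - z_birth) > threshold

print(rapid_weight_gain(-0.50, 0.30))  # gain of 0.80 SD -> True
print(rapid_weight_gain(0.00, 0.50))   # gain of 0.50 SD -> False
```

In the study, this boolean outcome is the dependent variable of the multivariable logistic regression.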

Relevance: 20.00%

Abstract:

INTRODUCTION Influenza vaccination in pregnancy is recommended for all women in Australia, particularly those who will be in their second or third trimester during the influenza season. However, there has been no systematic monitoring of influenza vaccine uptake among pregnant women in Australia. Evidence is emerging of benefit to the infant with respect to preventing influenza infection in the first 6 months of life. The FluMum study aims to systematically monitor influenza vaccine uptake during pregnancy in Australia and determine the effectiveness of maternal vaccination in preventing laboratory-confirmed influenza in their offspring up to 6 months of age. METHODS AND ANALYSIS A prospective cohort study of 10 106 mother-infant pairs recruited between 38 weeks gestation and 55 days postdelivery in six Australian capital cities. Detailed maternal and infant information is collected at enrolment, including influenza illness and vaccination history with a follow-up data collection time point at infant age 6 months. The primary outcome is laboratory-confirmed influenza in the infant. Case ascertainment occurs through searches of Australian notifiable diseases data sets once the infant turns 6 months of age (with parental consent). The primary analysis involves calculating vaccine effectiveness against laboratory-confirmed influenza by comparing the incidence of influenza in infants of vaccinated mothers to the incidence in infants of unvaccinated mothers. Secondary analyses include annual and pooled estimates of the proportion of mothers vaccinated during pregnancy, the effectiveness of maternal vaccination in preventing hospitalisation for acute respiratory illness and modelling to assess the determinants of vaccination. ETHICS AND DISSEMINATION The study was approved by all institutional Human Research Ethics Committees responsible for participating sites. 
Study findings will be published in peer-reviewed journals and presented at national and international conferences. TRIAL REGISTRATION NUMBER The study is registered with the Australia and New Zealand Clinical Trials Registry (ANZCTR), number 12612000175875.
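The primary analysis described above reduces to a standard vaccine-effectiveness calculation: one minus the ratio of influenza incidence in infants of vaccinated mothers to that in infants of unvaccinated mothers. A sketch with made-up counts (the function name and numbers are illustrative, not study results):

```python
# VE = 1 - (incidence in exposed group) / (incidence in comparison group),
# here comparing infants of vaccinated vs unvaccinated mothers.
def vaccine_effectiveness(cases_vacc, n_vacc, cases_unvacc, n_unvacc):
    risk_ratio = (cases_vacc / n_vacc) / (cases_unvacc / n_unvacc)
    return 1.0 - risk_ratio

# e.g. 5/1000 confirmed cases among infants of vaccinated mothers
# versus 20/1000 among infants of unvaccinated mothers:
print(vaccine_effectiveness(5, 1000, 20, 1000))  # -> 0.75, i.e. 75% effective
```

A real analysis would add confidence intervals and adjust for confounders; this shows only the point estimate.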

Relevance: 20.00%

Abstract:

Introduction: Xanthine oxidase (XO) is distributed in mammals largely in the liver and small intestine, but is also highly active in milk, where it generates hydrogen peroxide (H2O2). Adult human saliva is low in hypoxanthine and xanthine, the substrates of XO, and high in the lactoperoxidase substrate thiocyanate, but the saliva of neonates has not been examined. Results: Median concentrations of hypoxanthine and xanthine in neonatal saliva (27 and 19 μM respectively) were ten-fold higher than in adult saliva (2.1 and 1.7 μM). Fresh breastmilk contained 27.3±12.2 μM H2O2, but mixing baby saliva with breastmilk generated an additional >40 μM H2O2, sufficient to inhibit growth of the opportunistic pathogens Staphylococcus aureus and Salmonella spp. Oral peroxidase activity in neonatal saliva was variable but low (median 7 U/L, range 2–449) compared to adults (620 U/L, 48–1348), while the peroxidase substrate thiocyanate in neonatal saliva was surprisingly high. Baby but not adult saliva also contained nucleosides and nucleobases that encouraged growth of the commensal bacterium Lactobacillus but inhibited opportunistic pathogens; these nucleosides/bases may also promote growth of immature gut cells. Transition from the neonatal to the adult saliva pattern occurred during the weaning period. A survey of saliva from domesticated mammals revealed wide variation in nucleoside/base patterns. Discussion and Conclusion: During breast-feeding, baby saliva reacts with breastmilk to produce reactive oxygen species, while simultaneously providing growth-promoting nucleotide precursors. Milk thus plays more than a purely nutritional role in mammals, interacting with infant saliva to produce a potent combination of stimulatory and inhibitory metabolites that regulate the early oral, and hence gut, microbiota. Consequently, milk-saliva mixing appears to represent a unique biochemical synergism that boosts early innate immunity.

Relevance: 10.00%

Abstract:

Principal Topic: It is well known that most new ventures suffer from a significant lack of resources, which increases the risk of failure (Shepherd, Douglas and Shanley, 2000) and makes it difficult to attract stakeholders and financing for the venture (Bhide & Stevenson, 1999). The Resource-Based View (RBV) (Barney, 1991; Wernerfelt, 1984) is a dominant theoretical base increasingly drawn on within strategic management. While theoretical contributions applying RBV in the domain of entrepreneurship can arguably be traced back to Penrose (1959), there has been renewed attention recently (e.g. Alvarez & Busenitz, 2001; Alvarez & Barney, 2004). This said, empirical work is in its infancy, in part because of a lack of well-developed measuring instruments for testing ideas derived from RBV. The purpose of this study is to develop measurement scales that can serve to assist such empirical investigations. In so doing we try to overcome three deficiencies in current empirical measures used in applying RBV to the entrepreneurship arena. First, measures need to be developed for the resource characteristics and configurations associated with typical competitive advantages found in entrepreneurial firms. These include such things as alertness and industry knowledge (Kirzner, 1973), flexibility (Ebben & Johnson, 2005), strong networks (Lee et al., 2001) and, within knowledge-intensive contexts, unique technical expertise (Wiklund and Shepherd, 2003). Second, the RBV has the important limitations of being relatively static and modelled on large, established firms. In that context, traditional RBV focuses on competitive advantages. However, newly established firms often face disadvantages, especially those associated with the liabilities of newness (Aldrich & Auster, 1986). It is therefore important in entrepreneurial contexts to expand to an investigation of responses to competitive disadvantage through an RBV lens.
Conversely, recent research has suggested that resource constraints can actually have a positive effect on firm growth and performance under some circumstances (e.g., George, 2005; Katila & Shane, 2005; Mishina et al., 2004; Mosakowski, 2002; cf. also Baker & Nelson, 2005). Third, current empirical applications of RBV measure the levels or amounts of particular resources available to a firm. They infer that these resources deliver competitive advantage by establishing a relationship between resource levels and performance (e.g. via regression on profitability). However, there is the opportunity to directly measure the characteristics of resource configurations that deliver competitive advantage, such as Barney's well-known VRIO (Valuable, Rare, Inimitable and Organized) framework (Barney, 1997). Key Propositions and Methods: The aim of our study is to develop and test scales for measuring resource advantages (and disadvantages) and inimitability for entrepreneurial firms. The study proceeds in three stages. The first stage developed our initial scales based on earlier literature; where possible, we adapted scales from previous work. The first block of the scales related to the level of resource advantages and disadvantages. Respondents were asked the degree to which each resource category represented an advantage or disadvantage relative to other businesses in their industry on a 5-point response scale: Major Disadvantage, Slight Disadvantage, No Advantage or Disadvantage, Slight Advantage and Major Advantage. Items were developed as follows. Network capabilities (3 items) were adapted from Madsen, Alsos, Borch, Ljunggren and Brastad (2006). Knowledge resources for marketing expertise/customer service (3 items) and technical expertise (3 items) were adapted from Wiklund and Shepherd (2003). Flexibility (2 items) and costs (4 items) were adapted from JIBS B97. New scales were developed for industry knowledge/alertness (3 items) and product/service advantages.
The second block asked the respondent to nominate the most important resource advantage (and disadvantage) of the firm. For the advantage, they were then asked four questions on a 5-point Likert scale to determine how easy it would be for other firms to imitate and/or substitute this resource. For the disadvantage, they were asked corresponding questions about overcoming this disadvantage. The second stage involved two pre-tests of the instrument to refine the scales. The first was an online convenience sample of 38 respondents. The second pre-test was a telephone interview with a random sample of 31 nascent firms and 47 young firms (< 3 years in operation) generated using a PSED method of randomly calling households (Gartner et al., 2004). Several items were dropped or reworded based on the pre-tests. The third stage (currently in progress) is part of Wave 1 of CAUSEE (nascent firms) and FEDP (young firms), a PSED-type study being conducted in Australia. The scales will be tested and analysed with random samples of approximately 700 nascent and young firms respectively. In addition, a judgement sample of approximately 100 high-potential businesses in each category will be included. Findings and Implications: The main study (stage 3; data collection currently in progress) will allow comparison of the level of resource advantage/disadvantage across various sub-groups of the population. Of particular interest will be a comparison of the high-potential firms with the random sample. In the smaller pre-tests (N=38 and N=78) the factor structure of the items confirmed the distinctiveness of the constructs, and the reliabilities were within an acceptable range: Cronbach alphas ranged from 0.701 to 0.927. The study will provide an opportunity for researchers to better operationalize RBV theory in studies within the domain of entrepreneurship.
This is a fundamental requirement for the ability to test hypotheses derived from RBV in systematic, large scale research studies.
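The reliability figures quoted above (Cronbach alphas of 0.701 to 0.927) come from the standard internal-consistency formula, which is straightforward to compute. A minimal sketch using the textbook formula (the function name and example data are illustrative):

```python
import numpy as np

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the
# summed scale), for a respondents-by-items response matrix.
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

# Perfectly consistent items (every respondent answers both the same way)
# give the maximum alpha of 1.0:
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
```

Values above roughly 0.7, as reported for these scales, are conventionally taken as acceptable reliability.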

Relevance: 10.00%

Abstract:

“SOH see significant benefit in digitising its drawings and operation and maintenance manuals. Since SOH do not currently have digital models of the Opera House structure or other components, there is an opportunity for this national case study to promote the application of Digital Facility Modelling using standardized Building Information Models (BIM)”. The digital modelling element of this project examined the potential of building information models for facility management, focusing on the following areas:
• the re-usability of building information for FM purposes
• BIM as an integrated information model for facility management
• extendibility of the BIM to cope with business-specific requirements
• commercial facility management software using standardised building information models
• the ability to add (organisation-specific) intelligence to the model
• a roadmap for SOH to adopt BIM for FM
The project has established that BIM (building information modelling) is an appropriate and potentially beneficial technology for the storage of integrated building, maintenance and management data for SOH. Based on the attributes of a BIM, several advantages can be envisioned: consistency in the data, intelligence in the model, multiple representations, and a source of information for intelligent programs and intelligent queries. The IFC (open building exchange standard) specification provides comprehensive support for asset and facility management functions, and offers new management, collaboration and procurement relationships based on sharing of intelligent building data.
The major advantages of using an open standard are: information can be read and manipulated by any compliant software; user “lock-in” to proprietary solutions is reduced; third-party software can be the “best of breed” to suit the process and scope at hand; standardised BIM solutions consider the wider implications of information exchange outside the scope of any particular vendor; information can be archived as ASCII files; and data quality can be enhanced, as a single source of users’ information has improved accuracy, correctness, currency, completeness and relevance. SOH’s current building standards have been successfully drafted for a BIM environment and are confidently expected to be fully developed when BIM is adopted operationally by SOH. There have been remarkably few technical difficulties in converting the House’s existing conventions and standards to the new model-based environment. This demonstrates that the IFC model represents world practice for building data representation and management (see Sydney Opera House – FM Exemplar Project Report Number 2005-001-C-3, Open Specification for BIM: Sydney Opera House Case Study). Availability of FM applications based on BIM is in its infancy, but focussed systems are already in operation internationally and show excellent prospects for implementation at SOH. In addition to the generic benefits of standardised BIM described above, the following FM-specific advantages can be expected from this new integrated facilities management environment: faster and more effective processes, controlled whole-life costs and environmental data, better customer service, a common operational picture for current and strategic planning, visual decision-making and a total ownership cost model.
Tests with partial BIM data – provided by several of SOH’s current consultants – show that the creation of a SOH complete model is realistic, but subject to resolution of compliance and detailed functional support by participating software applications. The showcase has demonstrated successfully that IFC based exchange is possible with several common BIM based applications through the creation of a new partial model of the building. Data exchanged has been geometrically accurate (the SOH building structure represents some of the most complex building elements) and supports rich information describing the types of objects, with their properties and relationships.

Relevance: 10.00%

Abstract:

Crash risk is the statistical probability of a crash. It can be assessed ex post through statistical analysis or in real time with on-vehicle systems, which may be cooperative. Cooperative Vehicle-Infrastructure Systems (CVIS) are a developing research avenue in the automotive industry worldwide. This paper provides a survey of existing CVIS systems and methods for assessing crash risk with them, and describes the advantages of cooperative systems over non-cooperative systems. A sample of cooperative crash risk assessment systems is analysed to extract vulnerabilities according to three criteria: market penetration, over-reliance on GPS and broadcasting issues. It shows that cooperative risk assessment systems are still in their infancy and require further development to provide their full benefits to road users.

Relevance: 10.00%

Abstract:

Light Detection and Ranging (LIDAR) has great potential to assist vegetation management in power line corridors by providing more accurate geometric information about the power line assets and the vegetation along the corridors. However, the development of algorithms for the automatic processing of LIDAR point cloud data, in particular for feature extraction and classification of raw point cloud data, is still in its infancy. In this paper, we take advantage of LIDAR intensity and classify ground and non-ground points by statistically analysing the skewness and kurtosis of the intensity data. The Hough transform is then employed to detect power lines from the filtered object points. The experimental results show the effectiveness of our methods and indicate that better results were obtained using LIDAR intensity data than elevation data.
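One common way skewness is used for this kind of point filtering is "skewness balancing": iteratively peel off the highest-valued points until the remaining distribution is no longer positively skewed, and treat the removed points as non-ground. This sketch illustrates that general idea; it is an assumption for illustration, not necessarily the paper's exact algorithm, and the toy intensity values are invented:

```python
import numpy as np

def skewness(x):
    """Sample skewness (third standardized moment); 0 for degenerate data."""
    x = np.asarray(x, dtype=float)
    s = x.std()
    if s == 0.0:
        return 0.0
    return ((x - x.mean()) ** 3).mean() / s**3

def skewness_balance_threshold(values):
    """Remove the largest values until skewness <= 0; return the cut point."""
    pts = np.sort(np.asarray(values, dtype=float))
    while len(pts) > 2 and skewness(pts) > 0.0:
        pts = pts[:-1]  # peel off the current maximum
    return pts[-1]      # points above this are classed as non-ground/object

# Five similar low-intensity ground returns plus one bright outlier:
print(skewness_balance_threshold([1, 1, 1, 1, 1, 10]))  # -> 1.0
```

The "object" points surviving such a filter would then be the input to the Hough transform step for detecting the linear power-line features.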

Relevance: 10.00%

Abstract:

Executive function (EF) emerges in infancy and continues to develop throughout childhood. Executive dysfunction is believed to contribute to learning and attention problems in children at school age. Children born very preterm are more prone to these problems than their full-term peers.

Relevance: 10.00%

Abstract:

Principal Topic: Nascent entrepreneurship has drawn the attention of scholars in the last few years (Davidsson, 2006; Wagner, 2004). However, most studies have asked why firms are created, focussing on questions such as the characteristics (Delmar and Davidsson, 2000) and motivations (Carter, Gartner, Shaver & Reynolds, 2004) of nascent entrepreneurs, or the success factors in venture creation (Davidsson & Honig, 2003; Delmar and Shane, 2004). In contrast, research on how companies emerge is still in its infancy. On the theoretical side, effectuation, developed by Sarasvathy (2001), offers one view of the strategies that may be at work during the venture creation process. Causation, the theorized inverse of effectuation, may be described as a rational reasoning method for creating a company: after a comprehensive market analysis to discover opportunities, the entrepreneur selects the alternative with the highest expected return and implements it through the use of a business plan. In contrast, effectuation suggests that the future entrepreneur will develop her new venture in a more iterative way, selecting possibilities through flexibility and interaction with the market, affordable loss of resources and time invested, and the development of pre-commitments and alliances with stakeholders. Another contrasting point is that causation is "goal driven" while an effectual approach is "means driven" (Sarasvathy, 2001). One prediction of effectuation theory is that effectuation is more likely to be used by entrepreneurs early in the venture creation process (Sarasvathy, 2001). However, this temporal aspect and the impact of an effectuation strategy on venture outcomes have so far not been systematically and empirically tested on large samples. The reason behind this research gap is twofold. First, few studies collect longitudinal data on emerging ventures at an early enough stage of development to avoid severe survivor bias.
Second, the studies that collect such data have not included validated measures of effectuation. The research we are conducting attempts to partially fill this gap by combining an empirical investigation of a large sample of nascent and young firms with the effectuation/causation continuum as a basis (Sarasvathy, 2001). The objectives are to understand the strategies used by firms during the creation process and to measure their impacts on firm outcomes. Methodology/Key Propositions: This study draws its data from the first wave of the CAUSEE project, in which 28,383 Australian households were randomly contacted by phone using a specific methodology to capture emerging firms (Davidsson, Steffens, Gordon & Reynolds, 2008). This screening led to the identification of 594 nascent ventures (i.e., firms that are not yet operating) and 514 young firms (i.e., firms that have started operating since 2004) that were willing to participate in the study. Comprehensive phone interviews were conducted with these 1,108 ventures. In a likewise comprehensive follow-up 12 months later, 80% of the eligible cases completed the interview. The questionnaire contains specific sections designed to distinguish effectual and causal processes, innovation, gestation activities, business idea changes and venture outcomes. The effectuation questions are based on the components of effectuation strategy as described by Sarasvathy (2001), namely flexibility, affordable loss and pre-commitment from stakeholders. Results from two rounds of pre-testing informed the design of the instrument included in the main survey. The first two waves of data will be used to test and compare the use of effectuation in the venture creation process. To increase the robustness of the results, temporal use of effectuation will be tested both directly and indirectly:
1. By comparing the use of effectuation in nascent and young firms from wave 1 to wave 2, we will be able to find out how effectuation use changes over a 12-month period and whether the stage of venture development has an impact on its use.
2. By comparing nascent ventures early in the creation process with nascent ventures late in the creation process. Early versus late can be determined with the help of time-stamped gestation activity questions included in the survey. This will help us determine change on a small time scale during the creation phase of the venture.
3. By comparing nascent firms to young (already operational) firms.
4. By comparing young firms becoming operational in 2006 with those first becoming operational in 2004.
Results and Implications: Wave 1 and 2 data collection has been completed, and wave 2 data are currently being checked and cleaned. Analysis work will commence in September 2009. This paper is expected to contribute to the body of knowledge on effectuation by quantitatively measuring its use and its impact on nascent and young firms' activities at different stages of their development. In addition, this study will increase understanding of the venture creation process by comparing nascent and young firms over time in a large sample of randomly selected ventures. We acknowledge that the results from this study will be preliminary and will have to be interpreted with caution, as the changes identified may be due to several factors and not only to the use or non-use of effectuation. Meanwhile, we believe that this study is important to the field of entrepreneurship as it provides much-needed insights on the processes used by nascent and young firms during their creation and early operating stages.

Relevance: 10.00%

Abstract:

Fractional Fokker–Planck equations have been used to model several physical situations that exhibit anomalous diffusion. In this paper, a class of time- and space-fractional Fokker–Planck equations (TSFFPE), which involve the Riemann–Liouville time-fractional derivative of order 1−α (α ∈ (0, 1)) and the Riesz space-fractional derivative (RSFD) of order μ ∈ (1, 2), are considered. The solution of the TSFFPE is important for describing the competition between subdiffusion and Lévy flights. However, effective numerical methods for solving the TSFFPE are still in their infancy. We present three computationally efficient numerical methods to deal with the RSFD, and approximate the Riemann–Liouville time-fractional derivative using the Grünwald method. The TSFFPE is then transformed into a system of ordinary differential equations (ODEs), which is solved by the fractional implicit trapezoidal method (FITM). Finally, numerical results are given to demonstrate the effectiveness of these methods. These techniques can also be applied to solve other types of fractional partial differential equations.
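The Grünwald method mentioned above approximates a fractional derivative of order α as D^α f(x_j) ≈ h^(−α) Σ_k w_k f(x_{j−k}), where the weights w_k = (−1)^k C(α, k) obey a simple recurrence. A minimal sketch of the plain (unshifted) scheme, for illustration only (the paper's actual discretisation details may differ):

```python
import numpy as np

def grunwald_weights(alpha, n):
    """Grünwald weights w_k = (-1)^k * C(alpha, k) via the standard recurrence."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

def gl_derivative(f_vals, alpha, h):
    """Grünwald-Letnikov fractional derivative on a uniform grid of spacing h."""
    f_vals = np.asarray(f_vals, dtype=float)
    n = len(f_vals) - 1
    w = grunwald_weights(alpha, n)
    out = np.array([np.dot(w[: j + 1], f_vals[j::-1]) for j in range(n + 1)])
    return out / h**alpha

# Sanity check: for alpha = 1 the weights reduce to (1, -1, 0, 0, ...), so
# the scheme becomes a backward difference and the derivative of f(x) = x
# is exactly 1 away from the left boundary.
x = np.linspace(0.0, 1.0, 11)
d = gl_derivative(x, 1.0, x[1] - x[0])
print(np.allclose(d[1:], 1.0))  # -> True
```

In the paper's setting, applying such a discretisation to the time-fractional term is what converts the TSFFPE into a system of ODEs to be advanced by the implicit trapezoidal method.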

Relevance: 10.00%

Abstract:

Investigations into the biochemical markers associated with executive function (EF) impairment in children with early and continuously treated phenylketonuria (ECT-PKU) remain largely phenylalanine-only focused, despite experimental data showing that a high phenylalanine:tyrosine (phe:tyr) ratio is more strongly associated with EF deficit than phe alone. A high phe:tyr ratio is hypothesized to lead to a reduction in dopamine synthesis within the brain, which in turn results in the development of EF impairment. This paper provides a snapshot of current practice in the monitoring and/or treatment of tyrosine levels in children with PKU across 12 countries from Australasia, North America and Europe. Tyrosine monitoring in this population has increased over the last 5 years, with over 80% of clinics surveyed reporting routine monitoring of tyrosine levels in infancy alongside phe levels. Twenty-five percent of clinics surveyed reported actively treating/managing tyrosine levels (with supplemental tyrosine above that contained in PKU formulas) to ensure tyrosine levels remain within normal ranges. Anecdotally, supplemental tyrosine has been reported to ameliorate symptoms of both attention deficit hyperactivity disorder and depression in this population. EF assessment of children with ECT-PKU was likewise highly variable, with 50% of clinics surveyed reporting routine assessments of intellectual function. However, when function was assessed, the test instruments chosen tended towards global measures of IQ prior to school entry, rather than specific assessment of EF development. Further investigation of the role of tyrosine and its relationship with phe and EF development is needed to establish whether routine tyrosine monitoring and increased supplementation are warranted.

Relevance: 10.00%

Abstract:

With regard to the long-standing problem of the semantic gap between low-level image features and high-level human knowledge, the image retrieval community has recently shifted its emphasis from low-level feature analysis to high-level image semantics extraction. User studies reveal that users tend to seek information using high-level semantics. Therefore, image semantics extraction is of great importance to content-based image retrieval because it allows the users to freely express what images they want. Semantic content annotation is the basis for semantic content retrieval. The aim of image annotation is to automatically obtain keywords that can be used to represent the content of images. The major research challenges in image semantic annotation are: What is the basic unit of semantic representation? How can the semantic unit be linked to high-level image knowledge? How can contextual information be stored and utilized for image annotation? In this thesis, Semantic Web technology (i.e. ontology) is introduced to the image semantic annotation problem. The Semantic Web, the next-generation web, aims at making the content of any type of media understandable not only to humans but also to machines. Due to the large amounts of multimedia data prevalent on the Web, researchers and industries are beginning to pay more attention to the Multimedia Semantic Web. Semantic Web technology provides a new opportunity for multimedia-based applications, but research in this area is still in its infancy. Whether ontology can be used to improve image annotation, and how best to use ontology in semantic representation and extraction, is still a worthwhile investigation. This thesis deals with the problem of image semantic annotation using ontology and machine learning techniques in four phases, as below. 1) Salient object extraction.
A salient object serves as the basic unit in image semantic extraction, as it captures the common visual properties of objects. Image segmentation is often used as the first step for detecting salient objects, but most segmentation algorithms fail to generate meaningful regions due to over-segmentation and under-segmentation. We develop a new salient object detection algorithm by combining multiple homogeneity criteria in a region merging framework. 2) Ontology construction. Since real-world objects tend to exist in a context within their environment, contextual information has been increasingly used for improving object recognition. In the ontology construction phase, visual-contextual ontologies are built from a large set of fully segmented and annotated images. The ontologies are composed of several types of concepts (i.e. mid-level and high-level concepts) and domain contextual knowledge. The visual-contextual ontologies stand as a user-friendly interface between low-level features and high-level concepts. 3) Image object annotation. In this phase, each object is labelled with a mid-level concept from the ontologies. First, a set of candidate labels is obtained by training Support Vector Machines with features extracted from salient objects. After that, contextual knowledge contained in the ontologies is used to obtain the final labels by removing ambiguous concepts. 4) Scene semantic annotation. The scene semantic extraction phase derives the scene type using both mid-level concepts and domain contextual knowledge in the ontologies. Domain contextual knowledge is used to create a scene configuration that describes which objects co-exist with which scene type more frequently. The scene configuration is represented in a probabilistic graph model, and probabilistic inference is employed to calculate the scene type given an annotated image.
To evaluate the proposed methods, a series of experiments have been conducted in a large set of fully annotated outdoor scene images. These include a subset of the Corel database, a subset of the LabelMe dataset, the evaluation dataset of localized semantics in images, the spatial context evaluation dataset, and the segmented and annotated IAPR TC-12 benchmark.
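The "scene configuration" step, inferring a scene type from the objects annotated in an image, can be sketched with a simple naive-Bayes-style scorer. All of the probabilities, scene types and object labels below are invented for illustration; in the thesis these would be learned from the annotated training images and encoded in the probabilistic graph model:

```python
import math

# Illustrative prior over scene types and object-given-scene likelihoods.
PRIOR = {"beach": 0.5, "mountain": 0.5}
LIKELIHOOD = {
    "beach":    {"sky": 0.9, "sand": 0.8, "water": 0.7, "rock": 0.2},
    "mountain": {"sky": 0.9, "sand": 0.1, "water": 0.3, "rock": 0.8},
}

def infer_scene(object_labels):
    """Return the scene type maximizing log P(scene) + sum log P(object | scene)."""
    scores = {}
    for scene, prior in PRIOR.items():
        log_score = math.log(prior)
        for obj in object_labels:
            # Small floor probability for objects never seen with this scene.
            log_score += math.log(LIKELIHOOD[scene].get(obj, 1e-6))
        scores[scene] = log_score
    return max(scores, key=scores.get)

print(infer_scene(["sky", "sand", "water"]))  # -> beach
print(infer_scene(["sky", "rock"]))           # -> mountain
```

Working in log space avoids numerical underflow when many object labels are combined, which matters once the co-occurrence tables grow to realistic size.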

Relevance: 10.00%

Abstract:

Business processes have emerged as a well-respected variable in the design of successful corporations. However, unlike other key managerial variables, such as products and services, customers and employees, and physical or digital assets, the conceptualization and management of business processes are in many respects in their infancy. In this book, Jan Recker investigates the notion of quality of business process modeling grammars. His evaluation is based on an ontological, qualitative and quantitative analysis, applied to BPMN, a widely used business process modeling grammar. His results reveal the ontological shortcomings of BPMN, how these manifest themselves in actual process modeling practice, and how they influence the usage behavior of modeling practitioners. More generally, his book constitutes a landmark for empirical technology assessment, analyzing the way in which design flaws in technology influence usage behavior.