79 results for Process analysis
Abstract:
Background: There is general agreement across all interested parties that a process of working together is the best way to determine which school or educational setting is right for an individual child with autism spectrum disorder. In the UK, families and local authorities both desire a constructive working relationship and see this as the best means by which to reach an agreement on where a child should be educated. However, it has been shown by Batten and colleagues (Make schools make sense. Autism and education: the reality for families today; London: The National Autistic Society, 2006) that a constructive working relationship is not always achieved. Purpose: This small-scale study aims to explore the views of both parents and local authorities, focusing on how both parties perceive and experience the process of determining educational provision for children with autism spectrum disorders (ASD) within an English context. Sample, design and method: Parental opinion was gathered through a questionnaire with closed and open responses. The questionnaire was distributed through two national charities, two local charities and 16 specialist schools, which offered it to parents of children with ASD, resulting in an opportunity sample of 738 returned surveys. The views of personnel from five local authorities were gathered through semi-structured interviews. Data analyses included quantitative analysis of the closed-response questionnaire items and theme-based qualitative analysis of the open responses and the interviews with local authority personnel. Results: In the majority of cases, parents in the survey obtained their first-choice placement for their child. Despite this positive outcome, survey data indicated that parents found the process bureaucratic, stressful and time-consuming.
Parents tended to perceive alternative placement suggestions as financially motivated rather than in the best interests of the child. Interviews with local authority personnel showed an awareness of these concerns and of the complex considerations involved in determining what is best for an individual child. Conclusions: This small-scale study highlights the need for more effective communication between parents of children with ASD and local authority personnel at all stages of the process.
Abstract:
The method of entropy has been useful in evaluating inconsistency in human judgments. This paper applies an entropy-based decision support system called e-FDSS to multicriterion risk and decision analysis in projects of construction small and medium enterprises (SMEs). The system is optimized and solved by fuzzy logic, entropy, and genetic algorithms. A case study demonstrates the use of entropy in e-FDSS for analyzing multiple risk criteria in the predevelopment stage of SME projects. Survey data on the degree of impact of selected project risk criteria on different projects were input into the system in order to evaluate the preidentified project risks in an impartial environment. When the uncertainty embedded in the evaluation process is not taken into account, the results show that all decision vectors are indeed biased; quantifying these deviations provides a more objective decision and risk assessment profile that helps project stakeholders search for and screen the most profitable projects.
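As an illustration of how entropy can be used to weight judgment data, here is a minimal sketch of the generic entropy-weight method. This is not the authors' e-FDSS (the fuzzy logic and genetic algorithm components are omitted), and the score matrix is invented for the example:

```python
import math

def entropy_weights(scores):
    """Entropy weight method: criteria whose scores vary more across
    alternatives carry more information and receive larger weights.
    scores[i][j] is the rating of alternative i on criterion j."""
    m, n = len(scores), len(scores[0])
    weights = []
    for j in range(n):
        col = [scores[i][j] for i in range(m)]
        total = sum(col)
        p = [x / total for x in col]
        # Shannon entropy, normalised to [0, 1] by dividing by ln(m)
        e = -sum(q * math.log(q) for q in p if q > 0) / math.log(m)
        weights.append(1.0 - e)  # degree of diversification
    s = sum(weights)
    return [w / s for w in weights]

# Invented ratings: criterion 1 discriminates, criterion 2 does not.
w = entropy_weights([[0.9, 0.5], [0.1, 0.5], [0.5, 0.5]])
```

A criterion rated identically for every alternative has maximum entropy and therefore zero weight, which is the sense in which entropy "impartially" downweights uninformative judgments.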
Abstract:
We provide a unified framework for a range of linear transforms that can be used for the analysis of terahertz spectroscopic data, with particular emphasis on their application to the measurement of leaf water content. The use of linear transforms for filtering, regression, and classification is discussed. For illustration, a classification problem involving leaves at three stages of drought and a prediction problem involving simulated spectra are presented. Issues resulting from scaling the data set are discussed. Using Lagrange multipliers, we arrive at the transform that yields the maximum separation between the spectra and show that this optimal transform is equivalent to computing the Euclidean distance between the samples. The optimal linear transform is compared with the average for all the spectra as well as with the Karhunen–Loève transform to discriminate a wet leaf from a dry leaf. We show that taking several principal components into account is equivalent to defining new axes in which data are to be analyzed. The procedure shows that the coefficients of the Karhunen–Loève transform are well suited to the process of classification of spectra. This is in line with expectations, as these coefficients are built from the statistical properties of the data set analyzed.
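The use of Karhunen–Loève coefficients for classification can be sketched with a principal-component computation on synthetic data. The "spectra" below are made-up stand-ins, not real terahertz measurements; the point is only that the leading KL coefficient separates a high-absorbance (wet) class from a low-absorbance (dry) class:

```python
import numpy as np

# Toy "spectra": rows are samples, columns are frequency bins.
rng = np.random.default_rng(0)
dry = rng.normal(0.2, 0.02, size=(10, 50))   # low water absorbance
wet = rng.normal(0.8, 0.02, size=(10, 50))   # high water absorbance
X = np.vstack([dry, wet])

# Karhunen-Loeve transform: eigenvectors of the sample covariance.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
pc1 = eigvecs[:, -1]                         # leading component
coeffs = Xc @ pc1                            # KL coefficients on PC1

# The leading coefficient alone separates the two classes.
dry_c, wet_c = coeffs[:10], coeffs[10:]
separated = dry_c.max() < wet_c.min() or wet_c.max() < dry_c.min()
```

Taking further columns of `eigvecs` into account corresponds to the paper's observation that several principal components define new axes in which the data are analysed.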
Abstract:
Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can be either because they represent what the author believes the paper is about rather than what it actually is, or because they are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes a solution that examines the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. The primary method explores taking n-grams of the source document phrases and examining their synonyms, while the secondary method considers grouping outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work.
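The first step of the primary method, enumerating n-grams of the document as candidate phrases, can be sketched as follows. The synonym lookup and theme grouping stages are only noted in comments, since the resources the authors used are not specified here:

```python
def ngrams(tokens, n):
    """All contiguous n-grams of a token list, joined into phrases."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def candidate_phrases(text, max_n=3):
    """Candidate keyphrases: every 1- to max_n-gram of the document.
    A full system would next look up synonyms of each candidate (e.g.
    via a thesaurus such as WordNet) and group candidates that share
    synonyms into underlying themes."""
    tokens = [t.strip(".,;:").lower() for t in text.split()]
    cands = []
    for n in range(1, max_n + 1):
        cands.extend(ngrams(tokens, n))
    return cands

c = candidate_phrases("knowledge discovery in databases")
```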
Abstract:
In this paper we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Since the early work by Geltner (1989), many papers have been written on this topic, but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue in more detail we analyse a sample of individual property-level appraisal data from the Investment Property Database (IPD). We find that commonly used unsmoothing estimates overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns.
Abstract:
Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors' judgements of performance and risk. Real estate is selected, like other assets, on the basis of some criteria: commonly, its marginal contribution to the production of a mean-variance efficient multi-asset portfolio, subject to the investor's objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the required optimum exposure levels as dictated by an asset allocation model, the final decision may, and often will, be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions of and attitudes toward real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify the “soft” parameters in decision making that influence the optimal allocation for the asset class. This “soft” information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process and investors' perceptions, based on an historic analysis of market expectations, a comparison with historic data and an analysis of actual performance.
Abstract:
In this article, we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Many articles have been written on appraisal smoothing but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue we analyze a large sample of appraisal data at the individual property level from the Investment Property Databank. We find that commonly used unsmoothing estimates at the index level overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns at the index level and an ARMA model at the individual property level.
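For readers unfamiliar with unsmoothing, the classic first-order reverse filter (in the spirit of Geltner's early work, not the ARFIMA or ARMA models estimated in this paper) can be sketched as follows; the return series and smoothing parameter are invented for the demo:

```python
def unsmooth(appraisal_returns, alpha=0.6):
    """Reverse-filter an AR(1)-smoothed appraisal return series.
    The smoothing model assumed here is
        r*_t = alpha * r*_{t-1} + (1 - alpha) * r_t,
    so the underlying return is recovered as
        r_t = (r*_t - alpha * r*_{t-1}) / (1 - alpha)."""
    out = []
    for t in range(1, len(appraisal_returns)):
        r = (appraisal_returns[t] - alpha * appraisal_returns[t - 1]) / (1 - alpha)
        out.append(r)
    return out

# Demo: smooth a known series, then recover it by unsmoothing.
true = [0.02, -0.01, 0.03]
smoothed = [true[0]]
for r in true[1:]:
    smoothed.append(0.6 * smoothed[-1] + 0.4 * r)
recovered = unsmooth(smoothed, alpha=0.6)
```

The paper's finding is, in effect, that applying this kind of index-level filter implies more smoothing (a larger effective alpha) than is observed when the same exercise is done property by property.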
Abstract:
Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested into numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define, from the literature, the key factors in assessing a model's quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.
Abstract:
Elephant poaching and the ivory trade remain high on the agenda at meetings of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). Well-informed debates require robust estimates of trends, the spatial distribution of poaching, and drivers of poaching. We present an analysis of trends and drivers of an indicator of poaching across all elephant species. The site-based monitoring system known as Monitoring the Illegal Killing of Elephants (MIKE), set up by the 10th Conference of the Parties of CITES in 1997, produces carcass encounter data reported mainly by anti-poaching patrols. The data analyzed were site-by-year totals of 6,337 carcasses from 66 sites in Africa and Asia from 2002 to 2009. Analysis of these observational data poses a serious challenge to traditional statistical methods because of the opportunistic and non-random nature of patrols and the heterogeneity across sites. Adopting a Bayesian hierarchical modeling approach, we used the proportion of carcasses that were illegally killed (PIKE) as a poaching index to estimate the trend and the effects of site- and country-level factors associated with poaching. Important drivers of illegal killing that emerged at the country level were poor governance and low levels of human development, and at the site level, forest cover and the area of the site in regions where human population density is low. After a drop from 2002, PIKE remained fairly constant from 2003 until 2006, after which it increased until 2008. The results for 2009 indicate a decline. Sites with PIKE ranging from the lowest to the highest were identified. The results of the analysis provide a sound information base for scientific, evidence-based decision making in the CITES process.
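The PIKE index itself is a simple per-site-year proportion; a minimal sketch is below (the Bayesian hierarchical model fitted around these proportions is not reproduced):

```python
def pike(illegally_killed, total_carcasses):
    """Proportion of Illegally Killed Elephants (PIKE) for one
    site-year: carcasses attributed to illegal killing divided by all
    carcasses encountered by patrols. Undefined when no carcasses
    were encountered, which is one reason the paper models these
    counts hierarchically rather than using raw ratios alone."""
    if total_carcasses == 0:
        return None
    return illegally_killed / total_carcasses
```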
Abstract:
The themes of awareness and influence within the innovation diffusion process are addressed. The innovation diffusion process is typically represented as stages, yet awareness and influence are somewhat under-represented in the literature. Awareness and influence are situated within the contextual setting of individual actors but also within the broader institutional forces. Understanding how actors become aware of an innovation and then how their opinion is influenced is important for creating a more innovation-active UK construction sector. Social network analysis is proposed as one technique for mapping how awareness and influence occur and what they look like as a network. Empirical data are gathered using two modes of enquiry. This is done through a pilot study consisting of chartered professionals and then through a case study organization as it attempted to diffuse an innovation. The analysis demonstrates significant variations across actors’ awareness and influence networks. It is argued that social network analysis can complement other research methods in order to present a richer picture of how actors become aware of innovations and where they draw their influences regarding adopting innovations. In summarizing the findings, a framework for understanding awareness and influence associated with innovation within the UK construction sector is presented. Finally, with the UK construction sector continually being encouraged to be innovative, understanding and managing an actor’s awareness and influence network will be beneficial. The overarching conclusion thus describes the need not only to build research capacity in this area but also to push the boundaries related to the research methods employed.
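The kind of influence network the social network analysis would map can be sketched with a toy directed graph. The actors and edges below are invented, and in-degree is used as a crude stand-in for the centrality measures such an analysis would typically compute:

```python
# Hypothetical influence network: an edge (a, b) means actor a
# reports drawing influence from actor b when forming an opinion
# about adopting an innovation. All names are made up.
edges = [
    ("architect", "engineer"), ("contractor", "engineer"),
    ("surveyor", "engineer"), ("engineer", "client"),
    ("contractor", "architect"),
]

def in_degree(edges):
    """Count how many actors cite each actor as an influence."""
    counts = {}
    for _, target in edges:
        counts[target] = counts.get(target, 0) + 1
    return counts

deg = in_degree(edges)  # here "engineer" is the most-cited influence
```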
Abstract:
Increasingly, corporate occupiers seek more flexible ways of meeting their accommodation needs. One consequence of this process has been the growth of the executive suite, serviced office or business centre market. This paper, the final report of a research project funded by the Real Estate Research Institute, focuses upon the geographical distribution of business centers offering executive suites within the US. After a brief review of the development of the market, the paper examines the availability of data, provides basic descriptive statistics of the distribution of executive suites by state and by metropolitan statistical area and then attempts to model the distribution using demographic and socio-economic data at MSA level. The distribution reflects employment in key growth sectors and the position of the MSA in the urban hierarchy. An appendix presents a preliminary view of the global distribution of suites.
Abstract:
Modern Portfolio Theory (MPT) has been advocated as a more rational approach to the construction of real estate portfolios. The application of MPT can now be achieved with relative ease using the powerful facilities of modern spreadsheets, and does not necessarily need specialist software. This capability is found in an add-in tool, called an Optimiser or Solver, now included in several spreadsheet packages. The value of using this kind of more sophisticated spreadsheet analysis feature is increasingly difficult to ignore. This paper examines the use of the spreadsheet Optimiser in handling asset allocation problems. Using the Markowitz mean-variance approach, the paper introduces the necessary calculations and shows, by means of an elementary example implemented in Microsoft Excel, how the Optimiser may be used. Emphasis is placed on understanding the inputs to and outputs from the portfolio optimisation process, and the danger of treating the Optimiser as a black box is discussed.
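What the spreadsheet Optimiser does for the Markowitz problem can be imitated with a crude grid search, which also makes the inputs and outputs explicit rather than hiding them in a black box. The return and covariance figures below are invented for illustration, not market data:

```python
import itertools

# Illustrative expected returns and covariance for three asset classes.
mu = [0.08, 0.06, 0.10]
cov = [[0.040, 0.006, 0.010],
       [0.006, 0.010, 0.004],
       [0.010, 0.004, 0.060]]

def variance(w):
    """Portfolio variance w' * cov * w."""
    return sum(w[i] * cov[i][j] * w[j] for i in range(3) for j in range(3))

def ret(w):
    """Portfolio expected return w' * mu."""
    return sum(wi * mi for wi, mi in zip(w, mu))

# Grid-search long-only weights summing to 1; keep the minimum-variance
# mix that meets a 7% return target (the Optimiser's constraints).
best, best_var = None, float("inf")
steps = [i / 100 for i in range(101)]
for w1, w2 in itertools.product(steps, steps):
    w3 = 1.0 - w1 - w2
    if w3 < -1e-9:
        continue
    w = (w1, w2, max(w3, 0.0))
    if ret(w) >= 0.07 and variance(w) < best_var:
        best, best_var = w, variance(w)
```

A real Solver uses quadratic programming rather than enumeration, but the problem it solves (minimise `variance` subject to the return and budget constraints) is exactly the one spelled out here.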
Abstract:
In recent years, various efforts have been made in air traffic control (ATC) to maintain traffic safety and efficiency in the face of increasing air traffic demands. ATC is a complex process that depends to a large degree on human capabilities, and so understanding how controllers carry out their tasks is an important issue in the design and development of ATC systems. In particular, the human factor is considered to be a serious problem in ATC safety and has been identified as a causal factor in both major and minor incidents. There is, therefore, a need to analyse the mechanisms by which errors occur due to complex factors and to develop systems that can deal with these errors. From the cognitive process perspective, it is essential that system developers have an understanding of the more complex working processes that involve the cooperative work of multiple controllers. Distributed cognition is a methodological framework for analysing cognitive processes that span multiple actors mediated by technology. In this research, we attempt to analyse and model interactions that take place in en route ATC systems based on distributed cognition. We examine the functional problems in an ATC system from a human factors perspective, and conclude by identifying certain measures by which to address these problems. This research focuses on the analysis of air traffic controllers' tasks for en route ATC and modelling controllers' cognitive processes.
Abstract:
We analyze the large time behavior of a stochastic model for the lay down of fibers on a moving conveyor belt in the production process of nonwovens. It is shown that under weak conditions this degenerate diffusion process has a unique invariant distribution and is even geometrically ergodic. This generalizes results from previous works [M. Grothaus and A. Klar, SIAM J. Math. Anal., 40 (2008), pp. 968–983; J. Dolbeault et al., arXiv:1201.2156] concerning the case of a stationary conveyor belt, in which the situation of a moving conveyor belt has been left open.
Abstract:
In order to shed light on the collective behavior of social insects, we analyzed the behavior of ants from a single individual up to multiple individuals. In an experimental set-up, ants were placed in a hemisphere without a nest or food, and their trajectories were recorded. From this bottom-up approach, we found the following collective behaviors: (1) the activity of a single ant increases and decreases periodically; (2) a spontaneous meeting process is observed between two ants, and the meeting spot of the two ants is localized within the hemisphere; (3) division of labor is observed between two ants.