Abstract:
Sociological approaches to inquiry on emotion in educational settings are growing. Despite a long tradition of research and theory in disciplines such as psychology and sociology, the methods and approaches for naturalistic investigation of emotion are in a developmental phase in educational settings. In this article, recent empirical studies on emotion in educational contexts are canvassed. The discussion focuses on the use of multiple methods within research conducted in high school and university classrooms, highlighting recent methodological progress. The methods discussed include facial expression analysis, verbal and non-verbal conduct, and self-report methods. Analyses drawn from different studies, informed by perspectives from microsociology, highlight the strengths and limitations of any one method. The power and limitations of multi-method approaches are discussed.
Abstract:
These lecture notes describe the use and implementation of a framework in which mathematical as well as engineering optimisation problems can be analysed. The foundations of the framework and algorithms described, Hierarchical Asynchronous Parallel Evolutionary Algorithms (HAPEAs), lie upon traditional evolution strategies and incorporate the concepts of multi-objective optimisation, hierarchical topology, asynchronous evaluation of candidate solutions, parallel computing and game strategies. In a step-by-step approach, the numerical implementation of EAs and HAPEAs for solving multi-criteria optimisation problems is presented, providing the reader with the knowledge to reproduce this hands-on training in his or her academic or industrial environment.
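To make the evolution-strategy foundation concrete, the sketch below shows a minimal (mu + lambda) loop on a toy objective. It illustrates the general mechanism only, not the HAPEA framework itself; the objective function, population sizes and mutation step are placeholder choices.

```python
# Minimal (mu + lambda) evolution strategy sketch -- an illustration only,
# not the HAPEA framework. Problem, population sizes and mutation step are
# placeholder choices.
import random

def sphere(x):                      # toy objective to minimise
    return sum(xi * xi for xi in x)

def evolve(dim=5, mu=5, lam=20, sigma=0.3, generations=100):
    parents = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            base = random.choice(parents)                       # pick a parent
            offspring.append([xi + random.gauss(0, sigma) for xi in base])
        pool = parents + offspring                              # (mu + lambda) selection
        pool.sort(key=sphere)
        parents = pool[:mu]
    return parents[0], sphere(parents[0])

best, fitness = evolve()
print(best, fitness)
```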
Abstract:
These lecture notes highlight some of the recent applications of multi-objective and multidisciplinary design optimisation in aeronautical design using the framework and methodology described in References 8, 23 and 24 and in Parts 1 and 2 of the notes. A summary of the methodology is given, and the treatment of uncertainties in flight condition parameters by the HAPEA software and game strategies is introduced. Several test cases dealing with detailed design and computed with the software are presented, and the results are discussed in Section 4 of these notes.
Abstract:
The oxides of copper (CuxO) are fascinating materials due to their remarkable optical, electrical, thermal and magnetic properties. Nanostructuring of CuxO can further enhance the performance of this important functional material and provide it with unique properties that do not exist in its bulk form. Three distinctly different phases of CuxO, namely CuO, Cu2O and Cu4O3, can be prepared by numerous synthesis techniques, including vapour deposition and liquid-phase chemical methods. In this article, we present a review of nanostructured CuxO focusing on their material properties, methods of synthesis and an overview of the various applications that have been associated with nanostructured CuxO.
Abstract:
Design Science is the process of solving ‘wicked problems’ through designing, developing, instantiating, and evaluating novel solutions (Hevner, March, Park and Ram, 2004). Wicked problems are characterised by agent finitude in combination with problem complexity and normative constraint (Farrell and Hooker, 2013). In Information Systems Design Science, determining that problems are ‘wicked’ differentiates Design Science Research (DSR) from Solutions Engineering (Winter, 2008) and is a necessary part of establishing the relevance of Information Systems Design Science research (Hevner, 2007; Iivari, 2007). Problem complexity is characterised by many problem components with nested, dependent and co-dependent relationships interacting through multiple feedback and feed-forward loops. Farrell and Hooker (2013) state specifically that for wicked problems “it will often be impossible to disentangle the consequences of specific actions from those of other co-occurring interactions”. This paper discusses the application of an Enterprise Information Architecture modelling technique to disentangle wicked problem complexity for one case. It proposes that such a modelling technique can be applied to other wicked problems and can lay the foundations for establishing relevance to DSR, provide solution pathways for artefact development, and help substantiate those elements required to produce Design Theory.
Abstract:
Thin plate spline finite element methods are used to fit a surface to an irregularly scattered dataset [S. Roberts, M. Hegland, and I. Altas. Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions. SIAM, 1:208--234, 2003]. The computational bottleneck for this algorithm is the solution of large, ill-conditioned systems of linear equations at each step of a generalised cross validation algorithm. Preconditioning techniques are investigated to accelerate the convergence of the solution of these systems using Krylov subspace methods. The preconditioners under consideration are block diagonal, block triangular and constraint preconditioners [M. Benzi, G. H. Golub, and J. Liesen. Numerical solution of saddle point problems. Acta Numer., 14:1--137, 2005]. The effectiveness of each of these preconditioners is examined on a sample dataset taken from a known surface. From our numerical investigation, constraint preconditioners appear to provide improved convergence for this surface fitting problem compared to block preconditioners.
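As a rough illustration of the kind of block preconditioning discussed above, the sketch below applies a block-diagonal preconditioner to a small synthetic saddle point system and solves it with MINRES. The matrices are random stand-ins, not the thin plate spline finite element system from the paper, and the Schur complement is formed explicitly only because the example is tiny.

```python
# Block-diagonal preconditioning of a synthetic saddle point system
#   [A  B^T] [u]   [f]
#   [B   0 ] [p] = [g]
# solved with MINRES. Random stand-in matrices, for illustration only.
import numpy as np
from scipy.sparse import bmat, identity, random as sprand
from scipy.sparse.linalg import LinearOperator, minres, splu

n, m = 200, 50
A = sprand(n, n, density=0.05, random_state=0) + 10 * identity(n)
A = ((A + A.T) / 2).tocsc()                       # symmetric positive definite block
B = sprand(m, n, density=0.05, random_state=1).tocsc()
K = bmat([[A, B.T], [B, None]]).tocsc()           # indefinite saddle point matrix
rhs = np.ones(n + m)

A_lu = splu(A)                                    # factorise A once
S = B @ A_lu.solve(B.T.toarray())                 # Schur complement B A^{-1} B^T (dense, m x m)

def apply_prec(v):
    # block-diagonal preconditioner diag(A, S): solve each block separately
    return np.concatenate([A_lu.solve(v[:n]), np.linalg.solve(S, v[n:])])

M = LinearOperator((n + m, n + m), matvec=apply_prec)
x, info = minres(K, rhs, M=M)
print("minres converged" if info == 0 else f"minres info = {info}")
```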
Abstract:
Introduction: This investigation aimed to assess the consistency and accuracy of radiation therapists (RTs) performing cone beam computed tomography (CBCT) alignment to fiducial markers (FMs) (CBCTFM) and the soft tissue prostate (CBCTST).
Methods: Six patients receiving prostate radiation therapy underwent daily CBCTs. Manual alignment of CBCTFM and CBCTST was performed by three RTs. Inter-observer agreement was assessed using a modified Bland–Altman analysis for each alignment method. Clinically acceptable 95% limits of agreement with the mean (LoAmean) were defined as ±2.0 mm for CBCTFM and ±3.0 mm for CBCTST. Differences between CBCTST alignment and the observer-averaged CBCTFM (AvCBCTFM) alignment were analysed. Clinically acceptable 95% LoA were defined as ±3.0 mm for the comparison of CBCTST and AvCBCTFM.
Results: CBCTFM and CBCTST alignments were performed for 185 images. The CBCTFM 95% LoAmean were within ±2.0 mm in all planes. CBCTST 95% LoAmean were within ±3.0 mm in all planes. Comparison of CBCTST with AvCBCTFM resulted in 95% LoA of −4.9 to 2.6, −1.6 to 2.5 and −4.7 to 1.9 mm in the superior–inferior, left–right and anterior–posterior planes, respectively.
Conclusions: Significant differences were found between soft tissue alignment and the predicted FM position. FMs are useful in reducing inter-observer variability compared with soft tissue alignment. Consideration needs to be given to margin design when using soft tissue matching due to increased inter-observer variability. This study highlights some of the complexities of soft tissue guidance for prostate radiation therapy.
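For readers unfamiliar with the limits-of-agreement calculation, the sketch below shows one common form of 95% limits of agreement with the mean for three observers: ±1.96 times the standard deviation of each observer's deviation from the per-image mean (correction factors for the number of observers are omitted here). The alignment offsets are made-up numbers, not data from the study.

```python
# Sketch of 95% limits of agreement with the mean (modified Bland-Altman)
# for three observers. The offsets below are invented, for illustration only.
import numpy as np

offsets = np.array([            # rows: images, columns: observers (mm)
    [1.2, 0.9, 1.4],
    [-0.3, 0.1, -0.2],
    [2.1, 1.8, 2.4],
    [0.5, 0.7, 0.2],
])
per_image_mean = offsets.mean(axis=1, keepdims=True)
deviations = offsets - per_image_mean        # each observer vs. the observer mean
sd = deviations.std(ddof=1)
loa = 1.96 * sd                              # 95% limits of agreement with the mean
print(f"95% LoA_mean: +/- {loa:.2f} mm")
```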
Abstract:
Optimisation is a fundamental step in the turbine design process, especially in the development of non-classical designs of radial-inflow turbines working with high-density fluids in low-temperature Organic Rankine Cycles (ORCs). The present work discusses the simultaneous optimisation of the thermodynamic cycle and the one-dimensional design of radial-inflow turbines. In particular, the work describes the integration of a 1D meanline preliminary design code adapted to real gases and a performance estimation approach for radial-inflow turbines into an established ORC cycle analysis procedure. The optimisation approach is split into two distinct loops: the inner loop performs the 1D design based on the parameters received from the outer loop, which optimises the thermodynamic cycle. The method uses parameters including brine flow rate, temperature and working fluid, shifting assumptions such as head and flow coefficients into the optimisation routine. The design and optimisation method is then validated against published benchmark cases. Finally, using the same conditions, the coupled optimisation procedure is extended to the preliminary design of a radial-inflow turbine with R143a as working fluid in realistic geothermal conditions, and compared against results from the commercially available software RITAL from Concepts-NREC.
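The two-loop structure can be illustrated schematically as below: an outer optimiser varies thermodynamic cycle parameters and, for each candidate cycle, an inner optimiser sets the 1D turbine design variables. Both objective functions here are deliberately artificial placeholders; the actual method uses a meanline design code adapted to real gases, not these toy expressions.

```python
# Structural sketch of a nested (two-loop) optimisation: outer loop over
# cycle parameters, inner loop over 1-D turbine design variables.
# Both objectives are placeholders, not real thermodynamic models.
from scipy.optimize import minimize

def inner_turbine_design(cycle_params):
    # Inner loop: choose head/flow coefficients that maximise a stand-in
    # efficiency model for the cycle conditions passed in from the outer loop.
    t_in, p_ratio = cycle_params
    def neg_efficiency(x):
        head_coeff, flow_coeff = x
        return -(0.9 - (head_coeff - 1.0) ** 2 - (flow_coeff - 0.3) ** 2
                 - 0.001 * abs(t_in - 400) - 0.01 * abs(p_ratio - 3.0))
    res = minimize(neg_efficiency, x0=[1.0, 0.3], method="Nelder-Mead")
    return -res.fun                      # best turbine efficiency for this cycle

def outer_cycle_objective(cycle_params):
    # Outer loop objective: overall performance of the cycle with its
    # best inner turbine design (maximised, hence the sign flip).
    return -inner_turbine_design(cycle_params)

best = minimize(outer_cycle_objective, x0=[420.0, 2.5], method="Nelder-Mead")
print(best.x, -best.fun)
```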
Abstract:
Purpose: To establish whether a passive or an active technique of planning target volume (PTV) definition and treatment for non-small cell lung cancer (NSCLC) delivers the most effective results. This literature review assesses the advantages and disadvantages reported in recent studies of each, while assessing the validity of the two approaches for planning and treatment.
Methods: A systematic review of the literature focusing on the planning and treatment of radiation therapy to NSCLC tumours. Different approaches published in recent articles are subjected to critical appraisal in order to determine their relative efficacy.
Results: Free-breathing (FB) is the optimal method to perform planning scans for patients and departments, as it involves no significant increase in cost, workload or education. Maximum intensity projection (MIP) is the fastest form of delineation; however, it is noted to be less accurate than the ten-phase overlap approach for computed tomography (CT). Although gating has proven to reduce margins and facilitate sparing of organs at risk, treatment times can be longer and planning time can be as much as 15 times higher for intensity modulated radiation therapy (IMRT). This raises issues with patient comfort and stabilisation, increasing the chance of geometric miss. Stereotactic treatments can take up to 3 hours to deliver, along with increases in planning and treatment time, as well as the additional hardware, software and training required.
Conclusion: Four-dimensional computed tomography (4DCT) is superior to 3DCT, with the passive FB approach for PTV delineation and treatment being optimal. Departments should use a combination of MIP with visual confirmation to ensure coverage for stage 1 disease. Stages 2-3 should be delineated using ten phases overlaid. Stereotactic and gated treatments for early stage disease should be used accordingly; FB-IMRT is optimal for later stage disease.
Abstract:
The Environmental Kuznets Curve (EKC) hypothesises an inverse U-shaped relationship between a measure of environmental pollution and per capita income levels. In this study, we apply non-parametric local polynomial regression (local quadratic fitting) to allow more flexibility in local estimation. This study uses a larger and globally representative sample of many local and global pollutants and natural resources, including Biological Oxygen Demand (BOD) emission, CO2 emission, CO2 damage, energy use, energy depletion, mineral depletion, improved water source, PM10, particulate emission damage, forest area and net forest depletion.
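A minimal sketch of local quadratic fitting is given below: at each evaluation point a quadratic is fitted by kernel-weighted least squares and the local intercept is taken as the fitted value. The income and pollution series are synthetic, purely to show the mechanics of the estimator; bandwidth selection is not addressed.

```python
# Minimal local quadratic regression sketch (non-parametric local polynomial
# fitting). Synthetic data, illustration of the mechanics only.
import numpy as np

rng = np.random.default_rng(0)
income = np.sort(rng.uniform(0, 10, 200))
pollution = 4 * income - 0.4 * income**2 + rng.normal(0, 1.5, 200)  # inverse U shape

def local_quadratic(x0, x, y, bandwidth=1.0):
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)       # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x), x - x0, (x - x0) ** 2])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)     # weighted least squares
    return beta[0]                                       # fitted value at x0

grid = np.linspace(0, 10, 50)
fit = [local_quadratic(g, income, pollution) for g in grid]
print(fit[:5])
```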
Abstract:
We implemented six different boarding strategies (Wilma, Steffen, Reverse Pyramid, Random, Blocks and By Letter) in order to investigate boarding times for Boeing 777 and Airbus 380 aircraft. We also introduce three new boarding methods to find the optimum boarding strategy. Our models explicitly simulate the behaviour of groups of people travelling together and the time taken to stow their luggage as part of the boarding process. Results from the simulation demonstrate that the Reverse Pyramid method is the best boarding method for the Boeing 777 and the Steffen method is the best for the Airbus 380. Among the newly suggested methods, the aisle-first method is the best boarding strategy for the Boeing 777 and the row arrangement method is the best for the Airbus 380. Overall, the best boarding strategy is the aisle-first method for the Boeing 777 and the Steffen method for the Airbus 380.
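The sketch below is a toy single-aisle model of why boarding order matters: passengers advance one row per time step, block the aisle while stowing luggage, and the boarding time is the number of steps until everyone is seated. The cabin size, stowing times and boarding orders are arbitrary and do not reflect the 777/A380 layouts or the group behaviour modelled in the study.

```python
# Toy single-aisle boarding simulation: passengers walk forward one row per
# tick, block the aisle while stowing, and the total time depends on the
# boarding order. Arbitrary parameters, illustration only.
import random

def simulate(boarding_order, stow_time=(2, 6)):
    passengers = [{"row": r, "pos": None, "stow": random.randint(*stow_time)}
                  for r in boarding_order]
    waiting = list(passengers)                 # still outside the aircraft, in order
    in_aisle, seated, t = [], 0, 0
    while seated < len(passengers):
        t += 1
        occupied = {p["pos"] for p in in_aisle}
        # move passengers already in the aisle, front of cabin first
        for p in sorted(in_aisle, key=lambda p: -p["pos"]):
            if p["pos"] == p["row"]:
                p["stow"] -= 1                 # stowing luggage at own row
                if p["stow"] == 0:             # done: sit down, free the aisle
                    in_aisle.remove(p)
                    occupied.discard(p["pos"])
                    seated += 1
            elif p["pos"] + 1 not in occupied: # step forward if next row is clear
                occupied.discard(p["pos"])
                p["pos"] += 1
                occupied.add(p["pos"])
        # next waiting passenger enters the aisle if the door row is free
        if waiting and 0 not in occupied:
            p = waiting.pop(0)
            p["pos"] = 0
            occupied.add(0)
            in_aisle.append(p)
    return t

back_to_front = list(range(29, -1, -1))
random_order = random.sample(range(30), 30)
print(simulate(back_to_front), simulate(random_order))
```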
Abstract:
Existing crowd counting algorithms rely on holistic, local or histogram-based features to capture crowd properties. Regression is then employed to estimate the crowd size. Insufficient testing across multiple datasets has made it difficult to compare and contrast different methodologies. This paper presents an evaluation across multiple datasets to compare holistic, local and histogram-based methods, and to compare various image features and regression models. A K-fold cross validation protocol is followed to evaluate the performance across five public datasets: the UCSD, PETS 2009, Fudan, Mall and Grand Central datasets. Image features are categorised into five types: size, shape, edges, keypoints and textures. The regression models evaluated are Gaussian process regression (GPR), linear regression, K nearest neighbours (KNN) and neural networks (NN). The results demonstrate that local features outperform equivalent holistic and histogram-based features, that optimal performance is observed using all image features except textures, and that GPR outperforms linear, KNN and NN regression.
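The evaluation protocol can be sketched as below: K-fold cross-validation comparing GPR, linear regression and KNN for mapping image features to a crowd count. The features and counts are synthetic stand-ins, not the UCSD, PETS 2009, Fudan, Mall or Grand Central data, and the neural network model is omitted for brevity.

```python
# Sketch of a K-fold evaluation of regression models for crowd counting.
# Synthetic features and counts, for illustration only.
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
features = rng.normal(size=(300, 5))                 # e.g. size/shape/edge features
counts = 20 + features @ np.array([5, 3, 2, 1, 0.5]) + rng.normal(0, 2, 300)

models = {
    "GPR": GaussianProcessRegressor(),
    "Linear": LinearRegression(),
    "KNN": KNeighborsRegressor(n_neighbors=5),
}
cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, model in models.items():
    mae = -cross_val_score(model, features, counts, cv=cv,
                           scoring="neg_mean_absolute_error")
    print(f"{name}: MAE {mae.mean():.2f} +/- {mae.std():.2f}")
```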
Abstract:
Plant food materials are in very high demand in the consumer market and, therefore, improved food products and efficient processing techniques are concurrently being researched in food engineering. In this context, numerical modelling and simulation techniques have a very high potential to reveal the fundamentals of the underlying mechanisms involved. However, numerical modelling of plant food materials during drying is quite challenging, mainly due to the complexity of the multiphase microstructure of the material, which undergoes excessive deformations during drying. In this regard, conventional grid-based modelling techniques have limited applicability because of the fundamental limitations of their fixed grids. As a result, meshfree methods have recently been developed which offer a more adaptable approach to problem domains of this nature, due to their fundamental grid-free advantages. In this work, a recently developed meshfree two-dimensional plant tissue model is used for a comparative study of microscale morphological changes of several food materials during drying. The model couples Smoothed Particle Hydrodynamics (SPH) and the Discrete Element Method (DEM) to represent the fluid and solid phases of the cellular structure. Simulations are conducted on apple, potato, carrot and grape tissues, and the results are qualitatively and quantitatively compared with experimental findings obtained from the literature. The study revealed that cellular deformations are highly sensitive to cell dimensions, cell wall physical and mechanical properties, middle lamella properties and turgor pressure. In particular, the meshfree model is well suited to simulating critically dried tissues at low moisture content and turgor pressure, which lead to cell wall wrinkling. The findings further highlight the potential applicability of the meshfree approach to model large deformations of the plant tissue microstructure during drying, providing a distinct advantage over state-of-the-art grid-based approaches.
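As a pointer to how the fluid phase is represented in such particle methods, the sketch below computes an SPH density summation with a cubic spline kernel. It illustrates the particle basis only and is not the coupled SPH-DEM tissue model from the study; the particle masses, smoothing length and domain size are arbitrary.

```python
# Minimal SPH density-summation sketch with a 2-D cubic spline kernel.
# Arbitrary particle configuration; not the coupled SPH-DEM tissue model.
import numpy as np

def cubic_spline_kernel(r, h):
    # standard 2-D cubic spline smoothing kernel
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)
    w = np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2 - q) ** 3, 0.0))
    return sigma * w

rng = np.random.default_rng(0)
positions = rng.uniform(0, 1e-4, size=(100, 2))      # particles inside a cell (m)
mass, h = 1e-9, 1e-5                                 # particle mass (kg), smoothing length (m)

# density of each particle as a kernel-weighted sum over all particles
diffs = positions[:, None, :] - positions[None, :, :]
dists = np.linalg.norm(diffs, axis=-1)
density = mass * cubic_spline_kernel(dists, h).sum(axis=1)
print(density[:5])
```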