928 results for Boundary Inhomogeneity Method
Abstract:
We present an iterative hierarchical algorithm for multi-view stereo. The algorithm attempts to utilise as much contextual information as is available to compute highly accurate and robust depth maps. There are three novel aspects to the approach: 1) we incrementally improve the depth fidelity as the algorithm progresses through the image pyramid; 2) we show how to incorporate visual hull information (when available) to constrain depth searches; and 3) we show how to simultaneously enforce the consistency of each depth map by continual comparison with neighbouring depth maps. We show that this approach produces highly accurate depth maps and, since it is essentially a local method, is both extremely fast and simple to implement.
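The coarse-to-fine refinement described in the abstract can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the `cost_fn` photo-consistency score, the nearest-neighbour upsampling, and the fixed sample count are all assumptions.

```python
import numpy as np

def refine_depth(coarse_depth, cost_fn, search_halfwidth, n_samples=11):
    """One coarse-to-fine step: for each pixel, search a narrow depth band
    centred on the upsampled coarse estimate. cost_fn(y, x, d) is a
    hypothetical photo-consistency score (lower is better)."""
    h, w = coarse_depth.shape
    # Upsample the coarse map by 2x (nearest neighbour, for simplicity).
    up = np.repeat(np.repeat(coarse_depth, 2, axis=0), 2, axis=1)
    refined = np.empty_like(up)
    for y in range(2 * h):
        for x in range(2 * w):
            # The search range shrinks at each pyramid level, which is what
            # incrementally improves the depth fidelity.
            candidates = np.linspace(up[y, x] - search_halfwidth,
                                     up[y, x] + search_halfwidth, n_samples)
            costs = [cost_fn(y, x, d) for d in candidates]
            refined[y, x] = candidates[int(np.argmin(costs))]
    return refined

# Toy example: the "true" depth is 5.0 everywhere; the cost is distance to it.
cost = lambda y, x, d: abs(d - 5.0)
coarse = np.full((4, 4), 4.0)          # biased coarse estimate
fine = refine_depth(coarse, cost, search_halfwidth=1.5)
```

In a full pipeline this step would be repeated per pyramid level, with the visual hull clamping the candidate interval and neighbouring depth maps feeding into the cost.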
Abstract:
This paper presents a strategy for delayed research method selection in qualitative interpretivist research. An exemplary case details how explorative interviews were designed and conducted in accordance with a paradigm prior to deciding whether to adopt grounded theory or phenomenology for data analysis. The focus here is to determine the most appropriate research strategy, in this case the methodological framing, to conduct research and represent findings, both of which are detailed. Research addressing current management issues requires both a flexible framework and the capability to consider the research problem from various angles, to derive tangible results for academia with immediate application to business demands. Researchers, and in particular novices, often struggle to decide on an appropriate research method suitable to address their research problem. This often applies to interpretative qualitative research, where it is not always immediately clear which method is the most appropriate, as the research objectives shift and crystallize over time. This paper uses an exemplary case to reveal how the strategy for delayed research method selection contributes to deciding whether to adopt grounded theory or phenomenology in the initial phase of a PhD research project. In this case, semi-structured interviews were used for data generation, framed in an interpretivist approach and situated in a business context. Research questions for this study were thoroughly defined and carefully framed in accordance with the research paradigm's principles, while at the same time ensuring that the requirements of both potential research methods were met. The grounded theory and phenomenology methods were compared and contrasted to determine their suitability and whether they meet the research objectives, based on a pilot study.
The strategy proposed in this paper is an alternative to the more 'traditional' approach, which initially selects the methodological formulation, followed by data generation. In conclusion, the suggested strategy for delayed research method selection intends to help researchers identify and apply the most appropriate method to their research. This strategy is based on explorations of data generation and analysis in order to derive faithful results from the data generated.
Abstract:
The introduction of the Australian curriculum, the use of standardised testing (e.g. NAPLAN) and the My School website are couched in a context of accountability. This circumstance has stimulated and in some cases renewed a range of boundaries in Australian Education. The consequences that arise from standardised testing have accentuated the boundaries produced by social reproduction in education which has led to an increase in the numbers of students disengaging from mainstream education and applying for enrolment at the Edmund Rice Education Australia Flexible Learning Centre Network (EREAFLCN). Boundaries are created for many young people who are denied access to credentials and certification as a result of being excluded from or in some way disengaging from standardised education and testing. Young people who participate at the EREAFLCN arrive with a variety of forms of cultural capital that are not valued in current education and employment fields. This is not to say that these young people’s different forms of cultural capital have no value, but rather that such funds of knowledge, repertoires and cultural capital are not valued by the majority of powerful agents in educational and employment fields. How then can the qualitative value of traditionally unorthodox - yet often intricate, ingenious, and astute - versions of cultural capital evident in the habitus of many young people be made to count, be recognised, be valuated? Can a process of educational assessment be a field of capital exchange and a space which breaches boundaries through a valuating process? This paper reports on the development of an innovative approach to assessment in an alternative education institution designed for the re-engagement of ‘at risk’ youth who have left formal schooling. A case study approach has been used to document the engagement of six young people, with an educational approach described as assessment for learning as a field of exchange across two sites in the EREAFLCN. 
In order to capture the broad range of students’ cultural and social capital, an electronic portfolio system (EPS) is under trial. The model draws on categories from sociological models of capital and reconceptualises the eportfolio as a sociocultural zone of learning and development. Results from the trial show a general tendency towards engagement with the EPS and potential for the attainment of socially valued cultural capital in the form of school credentials. In this way restrictive boundaries can be breached and a more equitable outcome achieved for many young Australians.
Abstract:
An improved scaling analysis and direct numerical simulations are performed for the unsteady natural convection boundary layer adjacent to a downward-facing inclined plate with uniform heat flux. The development of the thermal or viscous boundary layers may be classified into three distinct stages: a start-up stage, a transitional stage and a steady stage, which can be clearly identified in the analytical as well as the numerical results. Previous scaling analysis shows that the existing scaling laws of the boundary layer thickness, velocity and steady-state time scale for the natural convection flow on a heated plate of uniform heat flux provide a very poor prediction of the Prandtl number dependency of the flow, although those scalings perform very well with respect to the Rayleigh number and aspect ratio dependency. In this study, a modified Prandtl number scaling is developed using a triple-layer integral approach for Pr > 1. In comparison with the direct numerical simulations, the modified scaling performs considerably better than the previous scaling.
Abstract:
The accuracy of marker placement on palpable surface anatomical landmarks is an important consideration in biomechanics. Although marker placement reliability has been studied in some depth, it remains unclear whether or not the markers are accurately positioned over the intended landmark in order to define the static position and orientation of the segment. A novel method using commonly available X-ray imaging was developed to identify the accuracy of markers placed on the shoe surface by palpating landmarks through the shoe. Anterior–posterior and lateral–medial X-rays were taken of 24 participants with a newly developed marker set applied to both the skin and shoe. The vector magnitude of both skin- and shoe-mounted markers from the anatomical landmark was calculated, as well as the mean marker offset between skin- and shoe-mounted markers. The accuracy of placing markers on the shoe relative to the skin-mounted markers, accounting for shoe thickness, was less than 5 mm for all markers studied. Further, when using the developed guidelines provided in this study, the method was deemed reliable (intra-rater ICCs = 0.50–0.92). In conclusion, the method proposed here can reliably assess marker placement accuracy on the shoe surface relative to chosen anatomical landmarks beneath the skin.
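The vector-magnitude offsets described above reduce to Euclidean distances between digitised points. A minimal sketch, with hypothetical coordinates that are illustrative only and not data from the study:

```python
import numpy as np

# Hypothetical 3D coordinates (mm) digitised from the X-ray images: the
# anatomical landmark, the skin-mounted marker, and the shoe-mounted marker
# for one landmark on one participant.
landmark = np.array([10.0, 22.0, 5.0])
skin_marker = np.array([11.0, 23.5, 5.5])
shoe_marker = np.array([12.0, 24.0, 6.0])

def offset_magnitude(a, b):
    """Vector magnitude (Euclidean distance) between two points."""
    return float(np.linalg.norm(a - b))

skin_offset = offset_magnitude(skin_marker, landmark)        # skin vs landmark
shoe_offset = offset_magnitude(shoe_marker, landmark)        # shoe vs landmark
marker_offset = offset_magnitude(shoe_marker, skin_marker)   # shoe vs skin
```

In the study, `marker_offset` (adjusted for shoe thickness) is the quantity reported as being under 5 mm; averaging it over participants gives the mean marker offset.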
Abstract:
In recent years, the advent of new tools for musculoskeletal simulation has increased the potential for significantly improving the ergonomic design process and ergonomic assessment of design. In this paper we investigate the use of one such tool, ‘The AnyBody Modeling System’, applied to solve a one-parameter and yet, complex ergonomic design problem. The aim of this paper is to investigate the potential of computer-aided musculoskeletal modelling in the ergonomic design process, in the same way as CAE technology has been applied to engineering design.
Abstract:
This paper describes the formulation for the free vibration of joined conical-cylindrical shells with uniform thickness using the transfer of influence coefficients for the identification of structural characteristics. These characteristics are important for developing structural health monitoring models. The method was developed based on the successive transmission of dynamic influence coefficients, which are defined as the relationships between the displacement and force vectors at arbitrary nodal circles of the system. The two edges of the shell, which may have arbitrary boundary conditions, are supported by several elastic springs with meridional/axial, circumferential, radial and rotational stiffness, respectively. The governing equations of vibration of a conical shell, including a cylindrical shell as a special case, are written as a coupled set of first-order differential equations using the transfer matrix of the shell. Once the transfer matrix of a single component has been determined, the entire structure matrix is obtained as the product of each component matrix and the joining matrix. The natural frequencies and modes of vibration were calculated numerically for joined conical-cylindrical shells. The validity of the present method is demonstrated through simple numerical examples and through comparison with the results of previous researchers.
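The product-of-matrices assembly can be illustrated with toy matrices. The 2x2 values below are placeholders only; the actual field matrices couple the full displacement-force state vector at each nodal circle.

```python
import numpy as np

def assemble(component_matrices):
    """Chain component transfer matrices into the overall structure matrix:
    the state vector (displacements and forces at a nodal circle) is
    propagated through each component in order along the meridian."""
    T = np.eye(component_matrices[0].shape[0])
    for Tc in component_matrices:
        T = Tc @ T          # later components multiply on the left
    return T

cone  = np.array([[1.0, 0.2], [0.0, 1.0]])   # toy conical-segment matrix
joint = np.array([[1.0, 0.0], [0.1, 1.0]])   # toy cone-cylinder joining matrix
cyl   = np.array([[1.0, 0.5], [0.0, 1.0]])   # toy cylindrical-segment matrix

T_total = assemble([cone, joint, cyl])
state_in = np.array([1.0, 0.0])              # unit displacement, zero force
state_out = T_total @ state_in
```

In the actual method, natural frequencies are the values of the frequency parameter at which `T_total`, combined with the boundary (spring) conditions, admits a non-trivial state vector.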
Abstract:
Since the availability of 3D full body scanners and the associated software systems for operations with large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by the representatives of the 3D paradigm need to be critically reviewed though. 3D anthropometry has advantages as well as shortfalls, which need to be carefully considered. While it is apparent that the measurement of a full body point cloud allows for easier storage of raw data and improves quality control, the difficulties in calculating standardized measurements from the point cloud are widely underestimated. Early studies that made use of 3D point clouds to derive anthropometric dimensions have shown unacceptable deviations from the standardized results measured manually. While 3D human point clouds provide a valuable tool to replicate specific single persons for further virtual studies, or to personalize garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections. A 1D/2D approach is therefore sufficient to solve an ergonomic design problem. As a consequence, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach to design for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution function of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is to be seen as a two-dimensional integral or distribution function of length and girth, the calculation of combined percentiles – a common ergonomic requirement – is undefined.
Consequently, we suggest critically reviewing the cost and use of 3D anthropometry. We also recommend making proper use of widely available one- and two-dimensional anthropometric data in ergonomic design.
Abstract:
When used as floor joists, the new mono-symmetric LiteSteel beam (LSB) sections require web openings to provide access for inspections and various services. The LSBs consist of two rectangular hollow flanges connected by a slender web, and are subjected to lateral distortional buckling effects in the intermediate span range. Their member capacity design formulae developed to date are based on their elastic lateral buckling moments, and only limited research has been undertaken to predict the elastic lateral buckling moments of LSBs with web openings. This paper addresses this research gap by reporting the development of web opening modelling techniques based on an equivalent reduced web thickness concept and a numerical method for predicting the elastic buckling moments of LSBs with circular web openings. The proposed numerical method was based on a formulation of the total potential energy of LSBs with circular web openings. The accuracy of the proposed method’s use with the aforementioned modelling techniques was verified through comparison of its results with those of finite strip and finite element analyses of various LSBs.
Abstract:
The LiteSteel Beam (LSB) is an innovative cold-formed steel hollow flange section. When used as floor joists, the LSB sections require holes in the web to provide access for various services. In this study a detailed investigation was undertaken into the elastic lateral distortional buckling behaviour of LSBs with circular web openings subjected to a uniform moment using finite element analysis. Validated ideal finite element models were used first to study the effect of web holes on their elastic lateral distortional buckling behaviour. An equivalent web thickness method was then proposed using four different equations for the elastic buckling analyses of LSBs with web holes. It was found that two of them could be successfully used with approximate numerical models based on solid web elements with an equivalent reduced thickness to predict the elastic lateral distortional buckling moments.
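The equivalent web thickness idea can be illustrated with a simple area-based stand-in. This is NOT one of the four calibrated equations developed in the paper, and the dimensions are hypothetical LSB-like numbers, not catalogue values:

```python
import math

def equivalent_web_thickness(t_web, hole_dia, web_depth, spacing):
    """Illustrative equivalent reduced web thickness for a web with circular
    openings: scale the solid web thickness by the ratio of net to gross web
    area over one hole spacing. A solid-web finite element model built with
    this reduced thickness then approximates the perforated web's stiffness
    in an elastic buckling analysis."""
    gross = web_depth * spacing              # solid web area per spacing
    hole = math.pi * hole_dia ** 2 / 4.0     # area removed by one hole
    return t_web * (gross - hole) / gross

# Hypothetical numbers (mm): 2.0 mm web, 250 mm deep web,
# 125 mm diameter holes at 500 mm centres.
t_eq = equivalent_web_thickness(2.0, 125.0, 250.0, 500.0)
```

The paper's contribution is precisely in replacing this naive area ratio with equations validated against finite strip and finite element lateral distortional buckling results.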
Abstract:
The paper introduces the underlying principles and the general features of a meta-method (MAP method) developed as part of, and used in, various research, education and professional development programmes at ESC Lille. This method aims at providing an effective and efficient structure and process for acting and learning in various complex, uncertain and ambiguous managerial situations (projects, programmes, portfolios). The paper is developed around three main parts. First, I suggest revisiting the dominant visions of the project management knowledge field, based on the assumption that they do not adequately address current business and management contexts and situations, and that competencies in the management of entrepreneurial activities are the sources of value creation for organisations. Then, grounded on the former developments, I introduce the underlying concepts supporting the MAP method, seen as a 'convention generator', and how this meta-method inextricably links learning and practice in addressing managerial situations. Finally, I briefly describe an example of application, illustrating with a case study how the method integrates Project Management Governance, and give a few examples of use in Management Education and Professional Development.
Abstract:
The paper introduces the underlying principles and the general features of a meta-method (MAP method – Management & Analysis of Projects) developed as part of, and used in, various research, education and professional development programmes at ESC Lille. This method aims at providing an effective and efficient structure and process for acting and learning in various complex, uncertain and ambiguous managerial situations (projects, programmes, portfolios). The paper is organized in three parts. In the first part, I propose to revisit the dominant visions of the project management knowledge field, based on the assumption that they do not adequately address current business and management contexts and situations, and that competencies in the management of entrepreneurial activities are the sources of value creation for organisations. Then, grounded on the newly suggested perspective, the second part presents the underlying concepts supporting the MAP method, seen as a 'convention generator', and how this meta-method inextricably links learning and practice in addressing managerial situations. The third part describes an example of application, illustrating with a brief case study how the method integrates Project Management Governance, and gives a few examples of use in Management Education and Professional Development.
Abstract:
The paper investigates a detailed Active Shock Control Bump (SCB) design optimisation on a Natural Laminar Flow (NLF) aerofoil, RAE 5243, to reduce cruise drag at transonic flow conditions using Evolutionary Algorithms (EAs) coupled to a robust design approach. The positions of boundary layer transition (xtr) and the coefficient of lift (Cl) are considered as the uncertain design parameters (250 stochastic samples in total). Two robust design methods are considered: the first is a standard robust design method, which evaluates one design model at 250 stochastic conditions for uncertainty; the second combines the standard robust design method with the concept of hierarchical (multi-population) sampling (250, 50, 15) for uncertainty. Numerical results show that the evolutionary optimisation method coupled to uncertainty design techniques produces useful and reliable Pareto optimal SCB shapes which have low sensitivity and high aerodynamic performance while achieving significant total drag reduction. In addition, the results show the benefit of using the hierarchical robust method for detailed uncertainty design optimisation.
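The standard robust evaluation step can be sketched as follows. The `drag` function is a toy stand-in for the CFD evaluation, and all numbers are assumptions; only the structure (scoring one design under many stochastic xtr/Cl conditions, with a smaller sample budget for cheap screening) mirrors the abstract.

```python
import random

random.seed(0)

def drag(bump_height, xtr, cl):
    """Toy drag model: hypothetical stand-in for the CFD solver; drag is
    minimised near bump_height = 0.5 and varies with transition and lift."""
    return (bump_height - 0.5) ** 2 + 0.1 * xtr + 0.05 * cl

def robust_fitness(bump_height, n_samples):
    """Standard robust evaluation: score one design at n_samples stochastic
    (xtr, Cl) conditions and return the mean and spread of drag. A robust
    EA would minimise both, yielding a Pareto front of SCB shapes."""
    samples = [drag(bump_height, random.uniform(0.3, 0.7),
                    random.uniform(0.4, 0.8)) for _ in range(n_samples)]
    mean = sum(samples) / n_samples
    var = sum((s - mean) ** 2 for s in samples) / n_samples
    return mean, var ** 0.5

# Hierarchical sampling: cheap screening with few samples, full budget only
# for promising designs (mirroring the 250/50/15 population levels).
coarse_mean, _ = robust_fitness(0.5, 15)
full_mean, full_std = robust_fitness(0.5, 250)
```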
Abstract:
Increasing global competitiveness has forced manufacturing organizations to produce high-quality products more quickly and at a competitive cost, which demands continuous improvement techniques. In this paper, we propose a fuzzy-based performance evaluation method for the lean supply chain. To understand the overall performance of a cost-competitive supply chain, we investigate the alignment of market strategy and the position of the supply chain. Competitive strategies can be supported by using different weight calculations for different supply chain situations. By identifying optimal performance metrics and applying performance evaluation methods, managers can predict the overall supply chain performance under a lean strategy.
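A minimal sketch of such a fuzzy weighted evaluation: each metric is rated with a triangular fuzzy number and the weights reflect the supply chain's strategic position. The metric names, ratings, weights, and the centroid defuzzification are all illustrative assumptions, not the paper's specific model.

```python
def weighted_fuzzy_score(ratings, weights):
    """Aggregate triangular fuzzy ratings (low, mid, high) with crisp
    weights, then defuzzify via the centroid of the resulting triangle."""
    total_w = sum(weights.values())
    agg = [sum(weights[m] * ratings[m][i] for m in ratings) / total_w
           for i in range(3)]          # weighted low, mid, high
    return sum(agg) / 3.0              # centroid of a triangular number

ratings = {                            # (low, mid, high) on a 0-10 scale
    "cost":        (6.0, 7.0, 8.0),
    "delivery":    (5.0, 6.0, 7.0),
    "flexibility": (3.0, 5.0, 7.0),
}
# Under a lean, cost-competitive strategy, cost-related metrics get the
# largest weight; a different market position would shift these weights.
weights = {"cost": 0.5, "delivery": 0.3, "flexibility": 0.2}

score = weighted_fuzzy_score(ratings, weights)
```

Changing the weight vector per supply chain situation is what lets the same metric ratings yield different overall performance predictions for different competitive strategies.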
Abstract:
Monodisperse silica nanoparticles were synthesised by the well-known Stöber protocol, then dispersed in acetonitrile (ACN) and subsequently added to a bisacetonitrile gold(I) coordination complex ([Au(MeCN)2]+) in ACN. The silica hydroxyl groups were deprotonated in the presence of ACN, generating a formal negative charge on the siloxy groups. This allowed the [Au(MeCN)2]+ complex to undergo ligand exchange with the silica nanoparticles and form a surface coordination complex, with reduction to metallic gold (Au0) proceeding by an inner-sphere mechanism. The residual [Au(MeCN)2]+ complex was allowed to react with water, disproportionating into Au0 and Au(III), respectively, with the Au0 adding to the reduced gold already bound on the silica surface. The so-formed metallic gold seed surface was found to be suitable for the conventional reduction of Au(III) to Au0 by ascorbic acid (ASC). This process generated a thin and uniform gold coating on the silica nanoparticles. The silica NP batches synthesised were in a size range from 45 to 460 nm; of these, the batches in the size range from 400 to 480 nm were used for the gold-coating experiments.