882 results for large scale data gathering


Relevance: 100.00%

Abstract:

The seminal multiple-view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis methodology. Although seminal, these benchmark datasets are limited in scope, with few reference scenes. Here, we take these works a step further by proposing a new multi-view stereo dataset that is an order of magnitude larger in number of scenes, with a significant increase in diversity. Specifically, we propose a dataset containing 80 scenes of large variability. Each scene consists of 49 or 64 accurate camera positions and reference structured-light scans, all acquired by a 6-axis industrial robot. To apply this dataset, we propose an extension of the Middlebury evaluation protocol that reflects the more complex geometry of some of our scenes. The proposed dataset is used to evaluate the state-of-the-art multi-view stereo algorithms of Tola et al., Campbell et al. and Furukawa et al. We thereby demonstrate the usability of the dataset and gain insight into the workings and challenges of multi-view stereopsis. Through these experiments we empirically validate some of the central hypotheses of multi-view stereopsis, and determine and reaffirm some of the central challenges.
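
As a rough illustration of what a Middlebury-style evaluation protocol computes, the sketch below scores a reconstruction by accuracy (distance from reconstructed points to the reference scan) and completeness (coverage of the reference within a tolerance). This is a minimal sketch assuming both surfaces are sampled as point clouds; the function names, thresholds and data are hypothetical, not the dataset's actual protocol.

```python
# Hypothetical Middlebury-style accuracy/completeness scoring, assuming the
# reconstruction and the structured-light reference are Nx3 point clouds.
import numpy as np
from scipy.spatial import cKDTree

def accuracy(recon: np.ndarray, reference: np.ndarray, percentile: float = 90.0) -> float:
    """Accuracy: distance from each reconstructed point to the nearest
    reference point, summarised by a robust percentile."""
    dists, _ = cKDTree(reference).query(recon)
    return float(np.percentile(dists, percentile))

def completeness(recon: np.ndarray, reference: np.ndarray, tol: float = 0.002) -> float:
    """Completeness: fraction of reference points lying within `tol`
    (scene units) of the reconstruction."""
    dists, _ = cKDTree(recon).query(reference)
    return float(np.mean(dists < tol))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.random((10_000, 3))                 # stand-in for a reference scan
    rec = ref + rng.normal(0, 1e-3, ref.shape)    # stand-in for an MVS result
    print(accuracy(rec, ref), completeness(rec, ref))
```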

Relevance: 100.00%

Abstract:

When machining a large-scale aerospace part, the part is normally located and clamped firmly until a set of features has been machined. When the part is released, its size and shape may deform beyond the tolerance limits due to stress release. This paper presents the design of a new fixing method and flexible fixtures that automatically respond to workpiece deformation during machining. Deformation is inspected and monitored online, and part location and orientation can be adjusted in a timely manner to ensure that follow-up operations are carried out under low stress and with respect to the related datum defined in the design models.

Relevance: 100.00%

Abstract:

This paper presents for the first time the concept of measurement-assisted assembly (MAA) and outlines the research priorities for realising this concept in industry. MAA denotes a paradigm shift in assembly for high-value and complex products, and encompasses the development and use of novel metrology processes for the holistic integration and capability enhancement of key assembly and ancillary processes. A complete framework for MAA is detailed, showing how it can facilitate a step change in assembly process capability and efficiency for large and complex products, such as airframes, where traditional assembly processes require rectification and rework, use inflexible tooling and are largely manual, resulting in cost and cycle-time pressures. The concept of MAA encompasses a range of innovative measurement-assisted processes which enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved levels of precision across the dimensional scales. A full-scale industrial trial of MAA technologies has been carried out on an experimental aircraft wing, demonstrating the viability of the approach, while studies within 140 smaller companies have highlighted the need for better adoption of existing process capability and quality control standards. The identified research priorities for MAA include the development of both frameless and tooling-embedded automated metrology networks. Other research priorities relate to the development of integrated dimensional variation management and thermal compensation algorithms, as well as measurement planning and inspection algorithms linking design to measurement and process planning. © Springer-Verlag London 2013.

Relevance: 100.00%

Abstract:

Society depends on complex IT systems created by integrating and orchestrating independently managed systems. The enormous increase in their scale and complexity over the past decade means new software-engineering techniques are needed to help us cope with their inherent complexity. The key characteristic of these systems is that they are assembled from other systems that are independently controlled and managed. While there is increasing awareness of related issues in the software engineering community, the most relevant background work comes from systems engineering. The interacting trading algorithms that led to the Flash Crash are an example of a coalition of systems, each serving the purposes of its owner and cooperating only because it has to. The owners of the individual systems were competing finance companies that were often mutually hostile. Each system jealously guarded its own information and could change without consulting any other system.

Relevance: 100.00%

Abstract:

The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on the production of intact, functional proteins. To achieve this, improved tools are needed for bio-processing. For example, implementation of process modeling and high-throughput technologies can be used to achieve quality by design, leading to improvements in productivity. Commercially, the most sought-after targets are secreted proteins, due to the ease of handling in downstream procedures. This chapter outlines different approaches for the production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.

Relevance: 100.00%

Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduce additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model covering measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface on small components. To reduce the uncertainty of the plane measurement, an evaluation index for the measurement strategy is presented. This index reflects the distribution of the sampling-point set and can be calculated from a moment-of-inertia matrix. Finally, a practical application is introduced to validate the evaluation index.
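
Under the abstract's assumptions (linearised error transmission, normally distributed errors), error propagation reduces to pushing a Gaussian through a linear map: if x ~ N(mu, Sigma) and y = Jx, then Cov(y) = J Sigma Jᵀ. The sketch below shows only that step; the matrix J and the variances are hypothetical stand-ins, not the paper's transmission functions.

```python
# Minimal sketch of linearised Gaussian error propagation, assuming the
# assembly error model reduces to y = J @ x with x ~ N(mu, Sigma).
import numpy as np

def propagate(J: np.ndarray, mu: np.ndarray, Sigma: np.ndarray):
    """Return the mean and covariance of y = J @ x for Gaussian x."""
    return J @ mu, J @ Sigma @ J.T

if __name__ == "__main__":
    J = np.array([[1.0, 0.5],
                  [0.0, 1.0]])                 # linearised transmission matrix
    mu = np.zeros(2)                           # nominal (zero-mean) error sources
    Sigma = np.diag([0.01**2, 0.02**2])        # measurement / fixture variances
    mean_y, cov_y = propagate(J, mu, Sigma)
    print(np.sqrt(np.diag(cov_y)))             # 1-sigma bounds on final position
```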

Relevance: 100.00%

Abstract:

The purpose of this investigation was to develop new techniques to generate segmental assessments of body composition based on Segmental Bioelectrical Impedance Analysis (SBIA). An equally important consideration was the design, simulation and development of the SBIA system, together with its software and hardware integration. This integration was carried out with a Very Large Scale Integration (VLSI) Field Programmable Gate Array (FPGA) microcontroller that analyzed the measurements obtained from segments of the body and provided full-body and segmental Fat Free Mass (FFM) and Fat Mass (FM) percentages. The issues related to estimating body composition in persons with spinal cord injury (SCI) were also addressed and investigated. This investigation demonstrated that the SBIA methodology provided accurate segmental body composition measurements. Disabled individuals are expected to benefit from these SBIA evaluations, as they are non-invasive and suitable for paralyzed individuals. The SBIA VLSI system may replace bulky, inflexible electronic modules attached to the body.
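
For orientation only: published bioimpedance regressions typically estimate fat-free mass from the impedance index height²/resistance, and a segmental system applies such a relation per body segment before summing. The coefficients and names below are invented placeholders, not the dissertation's calibrated model.

```python
# Illustrative-only sketch of a generic BIA estimate of body composition.
# Coefficients a, b, c are placeholders; real regressions are population-specific.
def fat_free_mass_kg(height_cm: float, resistance_ohm: float, weight_kg: float,
                     a: float = 0.5, b: float = 0.2, c: float = 5.0) -> float:
    """Generic impedance-index regression: FFM = a*h^2/R + b*w + c."""
    return a * height_cm**2 / resistance_ohm + b * weight_kg + c

def fat_mass_percent(weight_kg: float, ffm_kg: float) -> float:
    """Fat mass expressed as a percentage of body weight."""
    return 100.0 * (weight_kg - ffm_kg) / weight_kg

if __name__ == "__main__":
    ffm = fat_free_mass_kg(height_cm=175, resistance_ohm=500, weight_kg=80)
    print(ffm, fat_mass_percent(80, ffm))
    # A segmental system would evaluate a relation like this per segment
    # (arms, legs, trunk) and sum the results for a whole-body figure.
```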

Relevance: 100.00%

Abstract:

Internet Protocol Television (IPTV) is a system in which a digital television service is delivered using Internet Protocol over a network infrastructure. There is considerable confusion and concern about IPTV, since two different technologies have to be melded together to provide end customers with something better than conventional television. In this research, the functional architecture of the IPTV system was investigated. A Very Large Scale Integration based streaming server controller was designed, and different ways of hosting a web server that can be used to send control signals to the streaming server controller were studied. The web server accepts inputs from the keyboard and FPGA board switches and, depending on the preset configuration, opens a selected web page and sends the control signals to the streaming server controller. It was observed that the applications run faster on the PowerPC since it is embedded in the FPGA. The commercial market and global deployment of IPTV are also discussed.

Relevance: 100.00%

Abstract:

We developed a conceptual ecological model (CEM) for invasive species to help understand the role invasive exotics play in ecosystem ecology and their impacts on restoration activities. Our model, which can be applied to any invasive species, grew from the eco-regional conceptual models developed for Everglades restoration. These models identify ecological drivers, stressors, effects and attributes; we integrated the unique aspects of exotic species invasions and their effects into this conceptual hierarchy. We used the model to help identify important aspects of invasion in the development of an invasive exotic plant ecological indicator, which is described in a companion paper in this special issue. A key aspect of the CEM is that it is a general ecological model that can be tailored to specific cases and species, as the details of any invasion are unique to that invasive species. Our model encompasses the temporal and spatial changes that characterize invasion, identifying the general conditions that allow a species to become invasive in a de novo environment; it then enumerates the possible effects exotic species may have, collectively and individually, at varying scales and for different ecosystem properties, once a species becomes invasive. The model provides suites of characteristics and processes, as well as hypothesized causal relationships, to consider when thinking about the effects or potential effects of an invasive exotic and how restoration efforts will affect these characteristics and processes. To illustrate how to use the model as a blueprint for applying a similar approach to other invasive species and ecosystems, we give two examples of using it to evaluate the status of two south Florida invasive exotic plant species (melaleuca and Old World climbing fern) and consider the potential impacts of these invasive species on restoration.

Relevance: 100.00%

Abstract:

Developing analytical models that can accurately describe the behavior of Internet-scale networks is difficult. This is due, in part, to the heterogeneous structure, immense size and rapidly changing properties of today's networks. The lack of analytical models makes large-scale network simulation an indispensable tool for studying immense networks. However, large-scale network simulation has not been commonly used to study networks of Internet scale. This can be attributed to three factors: 1) current large-scale network simulators are geared towards simulation research and not network research, 2) the memory required to execute an Internet-scale model is exorbitant, and 3) large-scale network models are difficult to validate. This dissertation tackles each of these problems.

First, this work presents a method for automatically enabling real-time interaction, monitoring and control of large-scale network models. Network researchers need tools that allow them to focus on creating realistic models and conducting experiments; however, this should not increase the complexity of developing a large-scale network simulator. This work presents a systematic approach to separating the concerns of running large-scale network models on parallel computers from the user-facing concerns of configuring and interacting with them.

Second, this work deals with reducing the memory consumption of network models. As network models become larger, so does the amount of memory needed to simulate them. This work presents a comprehensive approach to exploiting structural duplication in network models to dramatically reduce the memory required to execute large-scale network experiments.

Lastly, this work addresses the issue of validating large-scale simulations by integrating real protocols and applications into the simulation. With an emulation extension, a network simulator operating in real time can run together with real-world distributed applications and services. As such, real-time network simulation not only alleviates the burden of developing separate models for applications in simulation but, as real systems are included in the network model, also increases the confidence level of network simulation. This work presents a scalable and flexible framework to integrate real-world applications with real-time simulation.
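
The memory-reduction idea, exploiting structural duplication, is essentially flyweight-style sharing: structurally identical pieces of the model are stored once and referenced many times. The sketch below illustrates only that idea; the class and function names are hypothetical, not the dissertation's simulator API.

```python
# Flyweight-style sharing of duplicated network-model structure: identical
# subnet descriptions are interned once and shared by reference.
class SharedSubnet:
    __slots__ = ("router_type", "host_count", "link_bandwidth")

    def __init__(self, router_type, host_count, link_bandwidth):
        self.router_type = router_type
        self.host_count = host_count
        self.link_bandwidth = link_bandwidth

_pool: dict = {}

def intern_subnet(router_type, host_count, link_bandwidth) -> SharedSubnet:
    """Return one shared instance for all structurally identical subnets."""
    key = (router_type, host_count, link_bandwidth)
    if key not in _pool:
        _pool[key] = SharedSubnet(*key)
    return _pool[key]

# A million structurally identical stub networks now cost one shared object
# (per-instance mutable state would be kept separately), not a million copies.
nets = [intern_subnet("router-a", 254, 100e6) for _ in range(1_000_000)]
assert all(n is nets[0] for n in nets)
```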

Relevance: 100.00%

Abstract:

Long-span bridges are flexible and therefore sensitive to wind-induced effects. One way to improve the stability of long-span bridges against flutter is to use cross-sections that involve twin side-by-side decks. However, this can amplify responses due to vortex-induced oscillations. Wind tunnel testing is a well-established practice for evaluating the stability of bridges against wind loads. In order to study the response of the prototype in the laboratory, dynamic similarity requirements should be satisfied. One parameter that is normally violated in wind tunnel testing is the Reynolds number. In this dissertation, the effects of Reynolds number on the aerodynamics of a double-deck bridge were evaluated by measuring the fluctuating forces on a motionless sectional model of a bridge at different wind speeds representing different Reynolds regimes. The efficacy of vortex mitigation devices was also evaluated across these regimes. Another parameter frequently ignored in wind tunnel studies is the correct simulation of turbulence characteristics. Due to the difficulties of simulating flow with a large turbulence length scale on a sectional model, wind tunnel tests are often performed in smooth flow as a conservative approach. The validity of the simplifying assumptions in the calculation of buffeting loads, as the direct impact of turbulence, needs to be verified for twin-deck bridges. The effects of turbulence characteristics were investigated by testing sectional models of a twin-deck bridge under two different turbulent flow conditions. Not only do the flow properties play an important role in the aerodynamic response of the bridge, but the geometry of the cross-section is also expected to have significant effects. In this dissertation, the effects of deck details, such as the width of the gap between the twin decks and the traffic barriers, on the aerodynamic characteristics of a twin-deck bridge were investigated, particularly on the vortex-shedding forces, with the aim of clarifying how these shape details can alter the wind-induced responses. Finally, a summary of the issues involved in designing a dynamic test rig for high-Reynolds-number tests is given, using the studied cross-section as an example.
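
As a reminder of why Reynolds similarity is hard to satisfy at model scale, the short sketch below evaluates Re = UL/ν for a hypothetical 0.4 m sectional-model deck dimension at a few tunnel speeds; the numbers are illustrative, not the dissertation's test matrix.

```python
# Reynolds number Re = U * L / nu for air at roughly room conditions.
def reynolds_number(wind_speed: float, length_scale: float,
                    nu: float = 1.5e-5) -> float:
    """Re = U*L/nu, with kinematic viscosity of air ~1.5e-5 m^2/s."""
    return wind_speed * length_scale / nu

for U in (5.0, 15.0, 45.0):   # m/s, spanning typical wind-tunnel speeds
    print(f"U = {U:5.1f} m/s -> Re = {reynolds_number(U, 0.4):.2e}")
```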

Relevance: 100.00%

Abstract:

The research presented in this paper is part of the SINBAD project, funded by STW (grant 12058) and EPSRC (grants EP/J00507X/1 and EP/J005541/1).

Relevance: 100.00%

Abstract:

The main goal of this work is to determine the true cost incurred by the Republic of Ireland and Northern Ireland in order to meet their EU renewable electricity targets. The primary all-island of Ireland policy goal is that 40% of electricity will come from renewable sources in 2020. From this it is expected that wind generation on the Irish electricity system will be in the region of 32-37% of total generation. This leads to issues resulting from wind energy being a non-synchronous, unpredictable and variable source of energy, used on a scale never before seen on a single synchronous system. If changes are not made to traditional operational practices, the efficient running of the electricity system will be directly affected by these issues in the coming years. Using models of the electricity system for the all-island grid of Ireland, the effects of the high wind energy penetration expected in 2020 are examined. These models were developed using a unit commitment and economic dispatch tool called PLEXOS, which allows a detailed representation of the electricity system down to the level of individual generators. The models replicate the true running of the electricity system through day-ahead scheduling and semi-relaxed use of these schedules, reflecting the Transmission System Operator's real-time decision making on dispatch. In addition, they carefully consider other non-wind priority-dispatch generation technologies that affect the overall system. In the models developed, three main issues associated with wind energy integration were selected for detailed examination to determine the sensitivity of assumptions presented in other studies: wind energy's non-synchronous nature, its variability and spatial correlation, and its unpredictability. This leads to an examination of effects in three areas: the system operation constraints required for system security; different onshore-to-offshore ratios of installed wind energy; and the degree of accuracy in wind energy forecasting. Each of these areas directly impacts the way in which the electricity system is run, as each addresses one of the three issues associated with wind energy stated above. It is shown that assumptions in these three areas have a large effect on the results in terms of total generation costs, wind curtailment and the dispatch of generator technology types. In particular, accounting for these issues results in wind curtailment being predicted in much larger quantities than previously reported. This would have a large effect on wind energy companies, because theirs is already a very low profit-margin industry. Results from this work show that the relaxation of system operation constraints is crucial to the economic running of the electricity system, with large improvements shown in the reduction of wind curtailment and system generation costs. There are clear benefits to having a proportion of the wind installed offshore in Ireland, which would help reduce the variability of wind energy generation on the system and therefore reduce wind curtailment. With envisaged future improvements in day-ahead wind forecasting from 8% to 4% mean absolute error, there are potential reductions in wind curtailment, system costs and open-cycle gas turbine usage.
This work illustrates the consequences of assumptions in the areas of system operation constraints, onshore/offshore installed wind capacities and accuracy in wind forecasting, to better inform the true costs associated with running Ireland's changing electricity system as it continues to decarbonise into the near future. Using Ireland as a case study, it also illustrates effects that will become ever more prevalent in other synchronous systems as they pursue a path of increasing renewable energy generation.
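
To make the curtailment mechanism concrete: when must-run synchronous generation plus available wind exceeds demand, wind is spilled. The toy merit-order dispatch below illustrates only this effect; the generator data and the minimum-synchronous rule are invented for illustration and are not the PLEXOS model used in this work.

```python
# Toy single-period merit-order dispatch showing how a minimum-synchronous
# (must-run thermal) constraint forces wind curtailment.
def dispatch(demand_mw, wind_mw, thermal, min_synchronous_mw):
    """Use free wind first, capped so must-run thermal still fits under demand;
    then fill the residual with thermal units in price order."""
    must_run = min(min_synchronous_mw, demand_mw)
    wind_used = min(wind_mw, demand_mw - must_run)
    curtailed = wind_mw - wind_used
    residual = demand_mw - wind_used
    output, cost = {}, 0.0
    for name, cap_mw, price in sorted(thermal, key=lambda g: g[2]):
        mw = min(cap_mw, residual)
        output[name] = mw
        residual -= mw
        cost += mw * price
    return output, cost, curtailed

thermal = [("ccgt", 2000, 60.0), ("ocgt", 800, 110.0), ("coal", 900, 45.0)]
out, cost, spilled = dispatch(demand_mw=4000, wind_mw=2500,
                              thermal=thermal, min_synchronous_mw=2000)
print(out, f"cost {cost:,.0f}/h", f"curtailed {spilled} MW")
```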

Relevance: 100.00%

Abstract:

The focus of this thesis is to explore and quantify the response of satellite-based gravity observations to large-scale solid mass transfer events. The gravity signature of large-scale solid mass transfers has not yet been deeply explored, mainly due to the lack of significant events during the lifespans of dedicated satellite gravity missions. In light of the next generation of gravity missions, the feasibility of employing satellite gravity observations to detect submarine and surface mass transfers is of importance for geoscience (improving the understanding of geodynamic processes) and for geodesy (improving the understanding of the dynamic gravity field). The aim of this thesis is twofold, focusing on assessing the feasibility of using satellite gravity observations to detect large-scale solid mass transfers and on modeling the impact of these events on the gravity field. A methodology that employs 3D forward-modeling simulations and 2D wavelet multiresolution analysis is suggested to estimate the impact of solid mass transfers on satellite gravity observations. The gravity signatures of various past submarine and subaerial events were estimated. Case studies were conducted to assess the sensitivity and resolvability required to observe gravity differences caused by solid mass transfers. Simulation studies were also employed to assess the expected contribution of the next generation of gravity missions to this application.
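
At its simplest, forward modeling of a mass transfer's gravity signature amounts to differencing the Newtonian attraction of the displaced mass before and after the event. The sketch below shows that first-order idea with a hypothetical landslide-scale mass observed at satellite altitude; it is far cruder than, and only stands in for, the thesis's 3D forward-modeling simulations.

```python
# Point-mass difference field of a solid mass transfer, observed from orbit.
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def attraction(obs: np.ndarray, src: np.ndarray, mass_kg: float) -> np.ndarray:
    """Magnitude of point-mass gravitational attraction (m/s^2) at each
    observation point."""
    d = obs - src
    r = np.linalg.norm(d, axis=-1)
    return G * mass_kg / r**2

# Hypothetical event: ~1e12 kg moved 50 km laterally, observed 400 km above.
obs = np.array([[0.0, 0.0, 400e3]])
before = attraction(obs, np.array([0.0, 0.0, 0.0]), 1e12)
after = attraction(obs, np.array([50e3, 0.0, 0.0]), 1e12)
print(after - before)  # the tiny signal a gravity mission would need to resolve
```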

Relevance: 100.00%

Abstract:

Introduction: Quantitative and accurate measurements of fat and muscle in the body are important for the prevention and diagnosis of diseases related to obesity and muscle degeneration. Manually segmenting muscle and fat compartments in MR body images is laborious and time-consuming, hindering implementation in large cohorts. In the present study, the feasibility and success rate of a Dixon-based MR scan followed by an intensity-normalised, non-rigid, multi-atlas based segmentation was investigated in a cohort of 3,000 subjects.

Materials and Methods: 3,000 participants in the in-depth phenotyping arm of the UK Biobank imaging study underwent a comprehensive MR examination. All subjects were scanned on a 1.5 T MR scanner with the dual-echo Dixon Vibe protocol, covering neck to knees. Subjects were scanned with six slabs in the supine position, without a localizer. Automated body composition analysis was performed using the AMRA Profiler™ system to segment and quantify visceral adipose tissue (VAT), abdominal subcutaneous adipose tissue (ASAT) and thigh muscles. Technical quality assurance was performed, and a standard set of acceptance/rejection criteria was established. Descriptive statistics were calculated for all volume measurements and quality assurance metrics.

Results: Of the 3,000 subjects, 2,995 (99.83%) were analysable for body fat, 2,828 (94.27%) were analysable when body fat and one thigh were included, and 2,775 (92.50%) were fully analysable for body fat and both thigh muscles. Datasets that could not be analysed were mainly due to missing slabs in the acquisition, or to the subject being positioned so that large parts of the volume were outside the field of view.

Discussion and Conclusions: This study showed that the rapid UK Biobank MR protocol was well tolerated by most subjects and sufficiently robust to achieve a very high success rate for body composition analysis. This research has been conducted using the UK Biobank Resource.