954 results for computer algorithm
Abstract:
We consider a joint relay selection and subcarrier allocation problem that minimizes the total system power for a multi-user, multi-relay, single-source cooperative OFDM-based two-hop system. The system is constrained so that every user receives a specified number of subcarriers (user fairness); no fairness constraints are imposed on the relays. To ensure optimum power allocation, the subcarriers in the two hops are paired with each other. We obtain an optimal subcarrier allocation for the single-user case using a method similar to that described in [1] and modify the algorithm for the multiuser scenario. Although optimality is not achieved in the multiuser case, the probability of all users being served fairly is improved significantly at a relatively low cost.
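As an illustrative sketch only (the paper's joint relay selection, pairing, and power allocation algorithm is not reproduced here), one widely used heuristic for two-hop subcarrier pairing is to sort the subcarriers of each hop by channel gain and pair them in order; the random channel gains below are placeholders.

```python
import numpy as np

def pair_subcarriers(gains_hop1, gains_hop2):
    """Pair subcarriers of the two hops by sorted channel gain.

    Returns (hop-1 index, hop-2 index) pairs in which the strongest
    hop-1 subcarrier is matched with the strongest hop-2 subcarrier,
    the second strongest with the second strongest, and so on.
    """
    order1 = np.argsort(gains_hop1)[::-1]   # best hop-1 subcarriers first
    order2 = np.argsort(gains_hop2)[::-1]   # best hop-2 subcarriers first
    return list(zip(order1, order2))

# Toy example with 8 subcarriers per hop (randomly drawn gains).
rng = np.random.default_rng(0)
g1 = rng.rayleigh(size=8) ** 2
g2 = rng.rayleigh(size=8) ** 2
for k1, k2 in pair_subcarriers(g1, g2):
    print(f"hop-1 subcarrier {k1}  <->  hop-2 subcarrier {k2}")
```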
Abstract:
In this paper, we present the outcomes of a project on the exploration of the use of Field Programmable Gate Arrays (FPGAs) as co-processors for scientific computation. We designed a custom circuit for the pipelined solving of multiple tri-diagonal linear systems. The design is well suited for applications that require many independent tri-diagonal system solves, such as finite difference methods for solving PDEs or applications utilising cubic spline interpolation. The selected solver algorithm was the Tri-Diagonal Matrix Algorithm (TDMA, or Thomas Algorithm). Our solver supports user-specified precision through the use of a custom floating-point VHDL library supporting addition, subtraction, multiplication and division. The variable-precision TDMA solver was tested for correctness in simulation mode. The TDMA pipeline was tested successfully in hardware using a simplified solver model. The details of implementation, the limitations, and future work are also discussed.
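For reference, a software version of the Thomas algorithm that the hardware pipeline implements can be sketched in a few lines of NumPy; this sketch is ours, not the VHDL design, and it uses fixed double precision rather than the solver's user-specified precision.

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with the Thomas algorithm (TDMA).

    a : sub-diagonal   (length n, a[0] unused)
    b : main diagonal  (length n)
    c : super-diagonal (length n, c[-1] unused)
    d : right-hand side (length n)
    """
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward sweep: eliminate the sub-diagonal.
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    # Back substitution.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against NumPy's dense solver on a small system.
n = 6
a = np.r_[0.0, -np.ones(n - 1)]
b = 2.0 * np.ones(n)
c = np.r_[-np.ones(n - 1), 0.0]
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
assert np.allclose(thomas_solve(a, b, c, d), np.linalg.solve(A, d))
```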
Abstract:
Rates of dehydration/rehydration are important quality parameters for dried products. Theoretically, if there are no adverse effects on the integrity of the tissue structure, it should absorb water back to the moisture content of the initial product before drying. The purpose of this work is to semi-automate the detection of cell structure boundaries as a food is dehydrated and rehydrated. This will enable food materials researchers to quantify changes to a material's structure as these processes take place. Images of potato cells as they were dehydrated and rehydrated were taken using an electron microscope. Cell boundaries were detected using an image processing algorithm. Average cell area and perimeter at each stage of dehydration were calculated and plotted versus time. The results show that the algorithm can successfully identify cell boundaries.
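The abstract does not detail the boundary-detection algorithm, so the following is only a generic sketch of how cell regions might be segmented and their average area and perimeter measured with scikit-image; the threshold-and-label pipeline, the size filter, and the file name are assumptions.

```python
import numpy as np
from skimage import io, filters, measure, morphology

def measure_cells(image_path):
    """Segment cell-like regions and return their mean area and perimeter.

    A generic threshold/label pipeline for illustration, not the
    boundary-detection algorithm developed in the paper.
    """
    image = io.imread(image_path, as_gray=True)
    thresh = filters.threshold_otsu(image)            # global threshold
    binary = image > thresh
    binary = morphology.remove_small_objects(binary, min_size=64)
    labels = measure.label(binary)                    # connected components
    props = measure.regionprops(labels)
    areas = [p.area for p in props]
    perims = [p.perimeter for p in props]
    return np.mean(areas), np.mean(perims)

# Hypothetical usage on one micrograph from a dehydration time series:
# mean_area, mean_perimeter = measure_cells("potato_cells_t05.png")
```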
Abstract:
One of the next great challenges of cell biology is the determination of the enormous number of protein structures encoded in genomes. In recent years, advances in electron cryo-microscopy and high-resolution single particle analysis have developed to the point where they now provide a methodology for high-resolution structure determination. Using this approach, images of randomly oriented single particles are aligned computationally to reconstruct 3-D structures of proteins and even whole viruses. One of the limiting factors in obtaining high-resolution reconstructions is obtaining a large enough representative dataset (>100,000 particles). Traditionally, particles have been picked manually, which is an extremely labour-intensive process. The problem is made especially difficult by the low signal-to-noise ratio of the images. This paper describes the development of automatic particle picking software, which has been tested with both negatively stained and cryo-electron micrographs. The algorithm has been shown to be capable of selecting most of the particles, with few false positives. Further work will involve extending the software to detect differently shaped and oriented particles.
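The picking algorithm itself is not described in the abstract; as a hedged baseline sketch (not the paper's method), particles can be located by normalised cross-correlation against a reference template. The template, thresholds, and synthetic test image below are assumptions.

```python
import numpy as np
from skimage.feature import match_template, peak_local_max

def pick_particles(micrograph, template, min_distance=20, threshold=0.4):
    """Return (row, col) candidate particle centres via normalised
    cross-correlation with a reference template.

    A baseline picker for illustration only; not the algorithm
    developed in the paper.
    """
    # Correlation map padded so peaks align with particle centres.
    score = match_template(micrograph, template, pad_input=True)
    return peak_local_max(score, min_distance=min_distance,
                          threshold_abs=threshold)

# Toy demonstration: plant a few blobs in noise and recover them.
rng = np.random.default_rng(1)
img = rng.normal(0.0, 1.0, (256, 256))            # low-SNR background
blob = np.outer(np.hanning(15), np.hanning(15))   # fake particle template
for r, c in [(40, 60), (120, 200), (210, 90)]:
    img[r - 7:r + 8, c - 7:c + 8] += 3.0 * blob
print(pick_particles(img, blob))
```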
Abstract:
The rank transform is a non-parametric transform which has been applied to the stereo matching problem. The advantages of this transform include its invariance to radiometric distortion and its amenability to hardware implementation. This paper describes the derivation of the rank constraint for matching using the rank transform. Previous work has shown that this constraint is capable of resolving ambiguous matches, thereby improving match reliability, and a new matching algorithm incorporating this constraint was proposed. This paper extends that previous work by proposing a matching algorithm which uses a match surface in which the match score is computed for every possible template and match window combination. The principal advantage of this algorithm is that the match surface enforces the left-right consistency and uniqueness constraints, thus improving the algorithm's ability to remove invalid matches. Experimental results for a number of test stereo pairs show that the new algorithm is capable of identifying and removing a large number of incorrect matches, particularly in the case of occlusions.
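As a hedged illustration of the match-surface idea (our simplified Python sketch, not the paper's implementation; the SAD score, window size, and disparity range are assumptions), the snippet below builds a cost volume holding a match score for every pixel/disparity combination and then invalidates disparities that fail a left-right consistency check.

```python
import numpy as np

def cost_volume(left, right, max_disp, win=5):
    """SAD match scores for every pixel and disparity (a 3-D 'match surface')."""
    h, w = left.shape
    pad = win // 2
    vol = np.full((h, w, max_disp + 1), np.inf)
    L = np.pad(left.astype(float), pad, mode="edge")
    R = np.pad(right.astype(float), pad, mode="edge")
    for d in range(max_disp + 1):
        for y in range(h):
            for x in range(d, w):
                patch_l = L[y:y + win, x:x + win]
                patch_r = R[y:y + win, x - d:x - d + win]
                vol[y, x, d] = np.abs(patch_l - patch_r).sum()
    return vol

def lr_consistent(disp_l, disp_r, tol=1):
    """Keep only disparities that agree when checked from both images."""
    h, w = disp_l.shape
    valid = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            d = disp_l[y, x]
            if x - d >= 0 and abs(disp_r[y, x - d] - d) <= tol:
                valid[y, x] = True
    return np.where(valid, disp_l, -1)   # -1 marks invalidated matches
```

In this toy version, `disp_l` would be taken as `cost_volume(left, right, max_disp).argmin(axis=2)` and `disp_r` computed analogously with the roles of the two images swapped.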
Abstract:
The rank transform is a non-parametric technique which has been recently proposed for the stereo matching problem. The motivation behind its application to the matching problem is its invariance to certain types of image distortion and noise, as well as its amenability to real-time implementation. This paper derives an analytic expression for the process of matching using the rank transform, and then goes on to derive one constraint which must be satisfied for a correct match. This has been dubbed the rank order constraint or simply the rank constraint. Experimental work has shown that this constraint is capable of resolving ambiguous matches, thereby improving matching reliability. This constraint was incorporated into a new algorithm for matching using the rank transform. This modified algorithm resulted in an increased proportion of correct matches, for all test imagery used.
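For concreteness, the rank transform itself can be sketched in a few lines of NumPy (the window size here is an assumption; the paper's analytic treatment is not reproduced): each pixel is replaced by the number of pixels in its local window whose intensity is lower than that of the centre pixel, which makes the result invariant to any monotonically increasing change of the grey levels.

```python
import numpy as np

def rank_transform(image, win=5):
    """Non-parametric rank transform: each pixel becomes the count of
    neighbours in a win x win window that are darker than the centre."""
    img = image.astype(float)
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.int32)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            window = padded[y:y + win, x:x + win]
            out[y, x] = np.count_nonzero(window < img[y, x])
    return out

# The transform is unchanged by a monotonic remapping of the grey levels:
img = np.random.default_rng(2).integers(0, 256, (32, 32))
assert np.array_equal(rank_transform(img), rank_transform(2 * img + 10))
```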
Abstract:
In most visual mapping applications suited to Autonomous Underwater Vehicles (AUVs), stereo visual odometry (VO) is rarely utilised as a pose estimator because imagery is typically captured at a very low frame rate due to energy conservation and data storage requirements. This adversely affects the robustness of a vision-based pose estimator and its ability to generate a smooth trajectory. This paper presents a novel VO pipeline for low-overlap imagery from an AUV that utilises constrained motion and integrates magnetometer data in a bi-objective bundle adjustment stage to achieve low-drift pose estimates over large trajectories. We analyse the performance of a standard stereo VO algorithm and compare the results to the modified VO algorithm. Results are demonstrated in a virtual environment in addition to low-overlap imagery gathered from an AUV. The modified VO algorithm shows significantly improved pose accuracy and performance over trajectories of more than 300 m. In addition, dense 3D meshes generated from the visual odometry pipeline are presented as a qualitative output of the solution.
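The bi-objective idea can be pictured with a much-simplified sketch: a sequence of planar poses is estimated by jointly minimising relative-motion residuals (a stand-in for the visual odometry/bundle adjustment term) and weighted absolute-heading residuals (a stand-in for the magnetometer term). The 2-D state, the weight `w_mag`, and the synthetic data are our assumptions, not the paper's pipeline.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(flat_poses, odom, mag_headings, w_mag=0.5):
    """Bi-objective residual: relative-motion terms plus weighted
    absolute-heading terms for a chain of [x, y, heading] poses."""
    poses = flat_poses.reshape(-1, 3)
    res = list(poses[0])                       # anchor the first pose at the origin
    for i in range(len(poses) - 1):
        dx, dy = poses[i + 1, :2] - poses[i, :2]
        c, s = np.cos(poses[i, 2]), np.sin(poses[i, 2])
        pred = np.array([c * dx + s * dy,      # forward motion in the body frame
                         -s * dx + c * dy,     # lateral motion in the body frame
                         poses[i + 1, 2] - poses[i, 2]])
        res.extend(pred - odom[i])
    # Magnetometer term pulls each heading towards its measured value.
    res.extend(w_mag * (poses[:, 2] - mag_headings))
    return np.array(res)

# Toy run: 20 poses on a gentle arc, noisy odometry, noisy magnetometer.
rng = np.random.default_rng(3)
N = 20
true_heading = np.linspace(0.0, 0.6, N)
odom = np.column_stack([np.ones(N - 1), np.zeros(N - 1), np.diff(true_heading)])
odom += 0.02 * rng.normal(size=(N - 1, 3))
mag = true_heading + 0.05 * rng.normal(size=N)

sol = least_squares(residuals, np.zeros(3 * N), args=(odom, mag))
print("final heading estimate (rad):", round(sol.x.reshape(-1, 3)[-1, 2], 3))
```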
Abstract:
To ensure the small-signal stability of a power system, power system stabilizers (PSSs) are extensively applied for damping low-frequency power oscillations by modulating the excitation supplied to synchronous machines, and increasing interest has been focused on developing different PSS schemes to tackle the threat that poorly damped oscillations pose to power system stability. This paper examines four different PSS models and investigates their performance in damping power system dynamics using both small-signal eigenvalue analysis and large-signal dynamic simulations. The four kinds of PSS examined are the Conventional PSS (CPSS), the Single Neuron based PSS (SNPSS), the Adaptive PSS (APSS) and the Multi-band PSS (MBPSS). A steepest-descent parameter optimization algorithm is employed to seek the optimal PSS design parameters. To evaluate the effects of these PSSs on improving power system dynamic behaviour, case studies are carried out on an 8-unit, 24-bus power system through both small-signal eigenvalue analysis and large-signal time-domain simulations.
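As a toy illustration of steepest-descent parameter tuning (not the paper's 8-unit, 24-bus study; the second-order swing model, gain penalty, and step size are assumptions), one can tune a single damping gain by following the negative finite-difference gradient of an eigenvalue-based objective.

```python
import numpy as np

def damping_objective(k, alpha=0.05):
    """Toy PSS-tuning objective: spectral abscissa (largest eigenvalue real
    part) of a 2nd-order swing model with damping gain k, plus a small
    penalty on the gain itself."""
    A = np.array([[0.0, 1.0],
                  [-25.0, -(0.2 + k)]])        # omega_n^2 = 25, light base damping
    return np.max(np.linalg.eigvals(A).real) + alpha * k**2

def steepest_descent(f, k0, step=0.5, tol=1e-4, max_iter=200, h=1e-4):
    """Minimise f(k) by following the negative finite-difference gradient."""
    k = k0
    for _ in range(max_iter):
        grad = (f(k + h) - f(k - h)) / (2.0 * h)
        if abs(grad) < tol:
            break
        k -= step * grad
    return k

k_opt = steepest_descent(damping_objective, k0=0.0)
print(f"tuned gain: {k_opt:.2f}")   # pushes the dominant mode further left
```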
Abstract:
With the progressive exhaustion of fossil energy and the enhanced awareness of environmental protection, more attention is being paid to electric vehicles (EVs). Inappropriate siting and sizing of EV charging stations could have negative effects on the development of EVs, the layout of the city traffic network, and the convenience of EV drivers, and could lead to an increase in network losses and a degradation in voltage profiles at some nodes. Given this background, the optimal sites of EV charging stations are first identified by a two-step screening method that takes into account environmental factors and the service radius of EV charging stations. Then, a mathematical model for the optimal sizing of EV charging stations is developed, with the minimization of the total cost associated with the EV charging stations to be planned as the objective function, and solved by a modified primal-dual interior point algorithm (MPDIPA). Finally, simulation results on the IEEE 123-node test feeder demonstrate that the developed model and method can not only attain a reasonable planning scheme for EV charging stations, but also reduce network losses and improve the voltage profile.
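To give a flavour of the sizing stage only (a heavily simplified stand-in: the site set, costs, coverage matrix, and demands below are invented, and scipy's LP solver is used in place of the paper's MPDIPA and detailed network model), the problem can be posed as minimising capacity cost subject to covering each demand zone.

```python
import numpy as np
from scipy.optimize import linprog

# Toy sizing problem for 3 pre-screened candidate sites and 4 demand zones.
# cost[i]     : cost per unit of charging capacity installed at site i
# cover[j, i] : 1 if site i lies within the service radius of zone j
# demand[j]   : charging demand of zone j that must be covered
cost = np.array([1.0, 1.3, 0.9])
cover = np.array([[1, 0, 1],
                  [1, 1, 0],
                  [0, 1, 1],
                  [0, 1, 0]])
demand = np.array([30.0, 40.0, 25.0, 15.0])
cap_max = np.array([60.0, 50.0, 40.0])

# Coverage constraints  cover @ x >= demand  become  -cover @ x <= -demand.
res = linprog(c=cost, A_ub=-cover, b_ub=-demand,
              bounds=list(zip(np.zeros(3), cap_max)))
print("installed capacity per site:", res.x)
print("total cost:", res.fun)
```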
Abstract:
It is acknowledged around the world that many university students struggle with learning to program (McCracken et al., 2001; McGettrick et al., 2005). In this paper, we describe how we have developed a research programme to systematically study and incrementally improve our teaching. We have adopted a research programme with three elements: (1) a theory that provides an organising framework for defining the type of phenomena and data of interest, (2) data on how the class as a whole performs on formative assessment tasks that are framed from within the organising framework, and (3) data from one-on-one think aloud sessions, to establish why students struggle with some of those in-class formative assessment tasks. We teach introductory computer programming, but this three-element structure of our research is applicable to many areas of engineering education research.
Abstract:
In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. Towards this, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods for the purpose of formulating web search queries that are capable of effectively retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is also proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, for improving the accuracy and efficiency of WebPut. Experiments based on several real-world data collections demonstrate that WebPut outperforms existing approaches.
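The greedy, confidence-based scheduling idea can be sketched as follows (our illustration only; `score_query` and `execute_query` are hypothetical callables, not WebPut's API): at each step the missing value whose best imputation query has the highest confidence is filled first, and the newly filled value is fed back so later queries can become more specific.

```python
def greedy_impute(missing_cells, score_query, execute_query):
    """Greedy, confidence-based scheduling of imputation queries.

    missing_cells : iterable of (row_id, attribute) pairs still missing
    score_query   : callable returning (confidence, query) for a cell,
                    given the current partially filled state
    execute_query : callable that runs a query and returns the imputed value
    """
    remaining = set(missing_cells)
    filled = {}
    while remaining:
        # Re-score every remaining cell against the current state.
        scored = [(score_query(cell, filled), cell) for cell in remaining]
        (confidence, query), cell = max(scored, key=lambda s: s[0][0])
        filled[cell] = execute_query(query)   # impute the most confident cell
        remaining.remove(cell)
    return filled

# Hypothetical usage with user-supplied scoring/execution stubs:
# result = greedy_impute(cells, score_query=my_scorer, execute_query=my_runner)
```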
Abstract:
The computer is fast becoming part of the furniture in many hospital settings. Increasing reliance on the computer for documentation and dissemination of information in patient-care areas has increased the need to consider this equipment as a potential environmental reservoir for microorganisms. This paper reports on a small experimental study which investigated the potential role of computers in cross-infection. The results indicate that computer surfaces are similar to other environmental surfaces and carry the same risks for cross-infection.
Abstract:
Food has been a major agenda in political, socio-cultural, and environmental domains throughout history. The significance of food has been particularly highlighted in recent years with the growing public awareness of the unfolding impacts of climate change, challenging our understanding, practice, and expectations of our relationship with food. Parallel to this development has been the rise of web applications such as blogs, wikis, video and photo sharing sites, and social networking systems that are arguably more open, collaborative, and personalisable. These so-called ‘Web 2.0’ technologies have contributed to a more participatory Internet experience than what had previously been possible. An increasing number of these social applications are now available on mobile technologies, where they take advantage of device-specific features such as sensors, location and context awareness, further expanding the potential for a culture of participation and creativity. This international volume assembles a diverse collection of book chapters that contribute towards exploring and better understanding the opportunities and challenges provided by tools, interfaces, methods, and practices of social and mobile technology to enable engagement with people and creativity in the domain of food in contemporary society. It brings together an international group of academics and practitioners from a diverse range of disciplines such as computing and engineering, social sciences, digital media and human-computer interaction to critically examine a range of applications of social and mobile technology, such as social networking, mobile interaction, wikis, Twitter, blogging, mapping, shared displays and urban screens, and their impact in fostering a better understanding and practice of environmentally, socio-culturally, economically, and health-wise sustainable food culture.