994 results for Hardware Solver
Abstract:
Mathematical models have been used to represent the distribution of applied water depth values over an irrigated area, information that is fundamental for evaluating the performance of irrigation systems. Despite these advances, there is still no universally accepted model for describing the distribution of the water depths applied by such systems. The objectives of this work were to propose a mathematical model for evaluating the performance of irrigation systems and to develop an adequacy factor for computing the gross water depth to be applied, one that combines the measures of uniformity and of water application efficiency into a single indicator. The fitting parameters of the proposed model were determined with the Solver routine of the Excel spreadsheet, and the irrigation performance indicators were calculated with mathematical expressions derived for use with the proposed model. Using irrigation performance data from a center-pivot system, the model was found to be appropriate for irrigation performance analysis and for obtaining the proposed irrigation adequacy factor, since it encompasses the performance indicators needed to evaluate the system, simplifies the analysis procedures, and allows direct calculation of the water depth required for irrigation.
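The abstract does not give the proposed model or the derived expressions, so the following is only a minimal sketch, in Python, of the general workflow it describes: fitting a candidate distribution to measured applied water depths with a least-squares solver (the role played by Excel's Solver routine) and computing a standard uniformity indicator such as Christiansen's coefficient. The normal-distribution assumption, the sample values and all names are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Measured applied water depths (mm) from a catch-can test (illustrative values).
depths = np.array([18.2, 20.1, 21.5, 19.8, 17.6, 22.3, 20.9, 19.1, 18.7, 21.0])

def cuc(x):
    """Christiansen's uniformity coefficient, 100 * (1 - sum|x - mean| / (n * mean))."""
    return 100.0 * (1.0 - np.abs(x - x.mean()).sum() / (x.size * x.mean()))

def model(x, mu, sigma):
    """Candidate CDF for the applied-depth distribution (normal CDF assumed here;
    the paper proposes its own model, which would replace this function)."""
    return norm.cdf(x, mu, sigma)

sorted_depths = np.sort(depths)
empirical_cdf = (np.arange(depths.size) + 0.5) / depths.size
(mu, sigma), _ = curve_fit(model, sorted_depths, empirical_cdf,
                           p0=[depths.mean(), depths.std()])

print(f"CUC = {cuc(depths):.1f} %, fitted mean = {mu:.1f} mm, sd = {sigma:.1f} mm")
```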
Abstract:
Road construction often requires the creation of a work zone. In these scenarios, portable concrete barriers (PCBs) are typically installed to shield workers and equipment from errant vehicles as well as to prevent motorists from striking other roadside hazards. For an existing W-beam guardrail system installed adjacent to the roadway and near the work zone, guardrail sections are removed in order to place the portable concrete barrier system. The focus of this research study was to develop a proper stiffness transition between W-beam guardrail and portable concrete barrier systems. This research effort was accomplished through the development and refinement of design concepts using computer simulation with LS-DYNA. Several design concepts were simulated, and design metrics were used to evaluate and refine each concept. These concepts were then analyzed and ranked based on feasibility, likelihood of success, and ease of installation. The rankings were presented to the Technical Advisory Committee (TAC) for selection of a preferred design alternative. Next, a Critical Impact Point (CIP) study was conducted, and additional analyses were performed to determine the critical attachment location and a reduced installation length for the portable concrete barriers. Finally, an additional simulation effort was conducted in order to evaluate the safety performance of the transition system under reverse-direction impact scenarios as well as to select the CIP. Recommendations were also provided for conducting a Phase II study and evaluating the nested Midwest Guardrail System (MGS) configuration using three Test Level 3 (TL-3) full-scale crash tests according to the criteria provided in the Manual for Assessing Safety Hardware, as published by the American Association of State Highway and Transportation Officials (AASHTO).
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because of their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC then evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike the CISC processor business, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated architectures consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base, thanks to strong technology-enabled customer lock-in and the customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of Machine-to-Machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will, in due course, face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to commoditized industrial automation will not happen in the near future.
Abstract:
Polynomial constraint solving plays a prominent role in several areas of hardware and software analysis and verification, e.g., termination proving, program invariant generation and hybrid system verification, to name a few. In this paper we propose a new method for solving non-linear constraints based on encoding the problem as an SMT problem over linear arithmetic only. Unlike other existing methods, our method focuses on proving satisfiability of the constraints rather than on proving unsatisfiability, which is more relevant in many applications, as we illustrate with several examples. Nevertheless, we also present new techniques, based on the analysis of unsatisfiable cores, that allow unsatisfiability to be proved efficiently for a broad class of problems as well. The power of our approach is demonstrated by means of extensive experiments comparing our prototype with state-of-the-art tools on benchmarks taken from both the academic and the industrial world.
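As a hedged illustration of the problem setting only (not of the paper's linear-arithmetic encoding), the snippet below uses the Z3 SMT solver's Python bindings to ask for a model of a small system of polynomial constraints; Z3 applies its own nonlinear reasoning here, whereas the method described above would reduce such constraints to linear arithmetic. The constraints are an invented toy example.

```python
from z3 import Reals, Solver, sat

x, y = Reals('x y')
s = Solver()
# A small satisfiable polynomial system: a point inside the unit circle,
# above the parabola y = x^2, with x positive.
s.add(x * x + y * y < 1, y > x * x, x > 0)

if s.check() == sat:
    print("sat, model:", s.model())   # e.g. x = 1/2, y = 1/2 (solver-dependent)
else:
    print("unsat or unknown")
```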
Abstract:
The coupling between topography, waves and currents in the surf zone may self-organize to produce the formation of shore-transverse or shore-oblique sand bars on an otherwise alongshore-uniform beach. In the absence of shore-parallel bars, this has been shown by previous linear stability analyses, and it is now extended to the finite-amplitude regime. To this end, a nonlinear model is developed that couples wave transformation and breaking, a shallow-water equations solver, sediment transport and bed updating. The sediment flux consists of a stirring factor multiplied by the depth-averaged current plus a downslope correction. It is found that the cross-shore profile of the ratio of stirring factor to water depth, together with the wave incidence angle, primarily determines the shape and the type of bars, either transverse or oblique to the shore. In the latter case, they can open an acute angle against the current (up-current oriented) or with the current (down-current oriented). At the initial stages of development, both the intensity of the instability that is responsible for the formation of the bars and the damping due to downslope transport grow at a similar rate with bar amplitude, the former being somewhat stronger. As the bars keep growing, their finite-amplitude shape either enhances downslope transport or weakens the instability mechanism, so that an equilibrium between the two opposing tendencies occurs, leading to a final saturated amplitude. The overall shape of the saturated bars in plan view is similar to that of the small-amplitude ones. However, the final spacings may be up to a factor of 2 larger, and the final celerities can also be about a factor of 2 smaller or larger. In the case of alongshore-migrating bars, the asymmetry of the longshore sections, the lee being steeper than the stoss, is well reproduced. Complex dynamics with merging and splitting of individual bars sometimes occur. Finally, in the case of shore-normal incidence the rip currents in the troughs between the bars are jet-like, while the onshore return flow is wider and weaker, as is observed in nature.
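Schematically, the sediment flux described above can be written as (a generic form consistent with the verbal description; the symbols are assumed, not quoted from the paper):

\[ \vec{q} = \alpha \left( \vec{v} - \gamma\, \nabla z_b \right), \]

where \(\alpha\) is the stirring factor, \(\vec{v}\) the depth-averaged current, \(\gamma\) the downslope (bedslope) coefficient and \(z_b\) the bed level; the key control identified in the abstract is then the cross-shore profile of \(\alpha/D\), with \(D\) the water depth.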
Abstract:
Measuring the height of the vertical jump is an indicator of the strength and power of the lower body. The technological tools available to measure the vertical jump are black boxes and are not open to third-party verification or adaptation. We propose the creation of a measurement system called Chronojump-Boscosystem, consisting of open hardware and free software. Methods: A microcontroller was created and validated using a square wave generator and an oscilloscope. Two types of contact platforms were developed using different materials. These platforms were validated by measuring, with a strain gauge, the minimum pressure required for activation at different points, and by comparing the on/off times of our platforms with those of the Ergojump-Boscosystem platform in a sample of 8 subjects performing submaximal jumps with one foot on each platform. Agile methodologies were used to develop and validate the software. Results: All the tools fall under the free software / open hardware guidelines and are, in that sense, free. The microcontroller margin of error is 0.1%. The validity of the fiberglass platform is 0.95 (ICC). The management software contains nearly 113,000 lines of code and is available in 7 languages.
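Contact platforms of this kind typically derive jump height from the measured flight time via the standard ballistic relation h = g·t_f²/8; the short sketch below illustrates that calculation. The formula is standard for flight-time methods, but the variable names and sample value are illustrative, not taken from the paper.

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(flight_time_s: float) -> float:
    """Jump height (m) from flight time (s), assuming symmetric take-off and landing."""
    return G * flight_time_s ** 2 / 8.0

# Example: a 0.50 s flight time corresponds to roughly 0.31 m.
print(f"{jump_height_from_flight_time(0.50):.3f} m")
```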
Abstract:
The complex structural organization of the white matter of the brain can be depicted in vivo in great detail with advanced diffusion magnetic resonance (MR) imaging schemes. Diffusion MR imaging techniques are increasingly varied, from the simplest and most commonly used technique (the mapping of apparent diffusion coefficient values) to the more complex, such as diffusion tensor imaging, q-ball imaging, diffusion spectrum imaging, and tractography. The type of structural information obtained differs according to the technique used. To fully understand how diffusion MR imaging works, it is helpful to be familiar with the physical principles of water diffusion in the brain and the conceptual basis of each imaging technique. Knowledge of the technique-specific requirements with regard to hardware and acquisition time, as well as the advantages, limitations, and potential interpretation pitfalls of each technique, is especially useful.
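For the simplest technique mentioned, ADC mapping, the coefficient is obtained from the standard monoexponential signal model (generic notation, not quoted from the article):

\[ S(b) = S_0\, e^{-b\,\mathrm{ADC}} \quad\Longrightarrow\quad \mathrm{ADC} = -\frac{1}{b}\,\ln\frac{S(b)}{S_0}, \]

where \(S_0\) is the signal without diffusion weighting and \(b\) is the diffusion-weighting factor (in s/mm²).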
Abstract:
Generalized Born methods are currently among the solvation models most commonly used for biological applications. We reformulate the generalized Born molecular volume method initially described by Lee et al. (2003, J Phys Chem, 116, 10606; 2003, J Comp Chem, 24, 1348) using fast Fourier transform convolution integrals. Changes to the initial method are discussed and analyzed. Finally, the method is extensively checked with snapshots from common molecular modeling applications: binding free energy computations and docking. Biologically relevant test systems, ranging from 855 to 36,091 atoms, are chosen. It is clearly demonstrated that, precision-wise, the proposed method performs as well as the original, and that it could benefit more readily from hardware-accelerated boards.
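For context, generalized Born models of this family evaluate the polar solvation free energy with Still's pairwise expression (a standard formula given here in generic notation; the paper's contribution is the FFT-based evaluation of the underlying molecular-volume integrals, not this expression itself):

\[ \Delta G_{\mathrm{pol}} \approx -\frac{1}{2}\left(\frac{1}{\varepsilon_{\mathrm{in}}}-\frac{1}{\varepsilon_{\mathrm{solv}}}\right) \sum_{i,j} \frac{q_i q_j}{f_{GB}(r_{ij})}, \qquad f_{GB} = \sqrt{r_{ij}^2 + R_i R_j \exp\!\left(-\frac{r_{ij}^2}{4 R_i R_j}\right)}, \]

where \(q_i\) are the atomic partial charges and \(R_i\) the effective Born radii derived from the molecular volume.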
Abstract:
Recent reports indicate that of the over 25,000 bridges in Iowa, slightly over 7,000 (29%) are either structurally deficient or functionally obsolete. While many of these bridges may be strengthened or rehabilitated, some simply need to be replaced. Before implementing one of these options, one should consider performing a diagnostic load test on the structure to more accurately assess its load-carrying capacity. Frequently, diagnostic load tests reveal strength and serviceability characteristics that exceed the predicted codified parameters. Usually, codified parameters are very conservative in predicting lateral load distribution characteristics and the influence of other structural attributes. As a result, the predicted rating factors are typically conservative. In cases where theoretical calculations show a structural deficiency, it may be very beneficial to apply a "tool" that utilizes a more accurate theoretical model which incorporates field-test data. At a minimum, this approach results in more accurate load ratings, and it often results in increased rating factors. Bridge Diagnostics, Inc. (BDI) developed hardware and software that are specially designed for performing bridge ratings based on data obtained from physical testing. To evaluate the BDI system, the research team performed diagnostic load tests on seven "typical" bridge structures: three steel-girder bridges with concrete decks, two concrete slab bridges, and two steel-girder bridges with timber decks. In addition, a steel-girder bridge with a concrete deck previously tested and modeled by BDI was investigated for model verification purposes. The tests were performed by attaching strain transducers to the bridges at critical locations to measure strains resulting from truck loading positioned at various locations on the bridge. The field test results were used to develop and validate analytical rating models. Based on the experimental and analytical results, it was determined that bridge tests could be conducted relatively easily, that accurate models could be generated with the BDI software, and that the load ratings, in general, were greater than the ratings obtained using the codified LFD Method (according to the AASHTO Standard Specifications for Highway Bridges).
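For orientation, the codified Load Factor (LFD) rating mentioned above has the standard AASHTO form (generic expression, not quoted from the report):

\[ RF = \frac{C - A_1 D}{A_2 L (1 + I)}, \]

where \(C\) is the member capacity, \(D\) and \(L\) the dead- and live-load effects, \(I\) the impact fraction, and \(A_1 = 1.3\), \(A_2 = 2.17\) (inventory) or \(1.3\) (operating). Diagnostic load testing mainly refines the live-load effect \(L\) (for example, the lateral load distribution), which is why field-calibrated models usually raise the rating factor.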
Abstract:
A good system of preventive bridge maintenance enhances the ability of engineers to manage and monitor bridge conditions and to take proper action at the right time. Traditionally, infrastructure inspection is performed via infrequent, periodic visual inspection in the field. Wireless sensor technology provides an alternative, cost-effective approach for constant monitoring of infrastructure. Scientific data-acquisition systems make reliable structural measurements, even in inaccessible and harsh environments, by using wireless sensors. With advances in sensor technology and the availability of low-cost integrated circuits, wireless monitoring sensor networks have come to be considered the new-generation technology for structural health monitoring. The main goal of this project was to implement a wireless sensor network for monitoring the behavior and integrity of highway bridges. At the core of the system is a low-cost, low-power wireless strain sensor node whose hardware design is optimized for structural monitoring applications. The key components of the system are the control unit, sensors, software and communication capability. The extensive information developed for each of these areas has been used to design the system. The performance and reliability of the proposed wireless monitoring system were validated on a 34-ft-span composite beam-in-slab bridge in Black Hawk County, Iowa. The microstrain data were successfully extracted from the output-only response collected by the wireless monitoring system. The energy efficiency of the system was investigated to estimate the battery lifetime of the wireless sensor nodes. This report also documents the system design, the method used for data acquisition, and system validation and field testing. Recommendations on further implementation of wireless sensor networks for long-term monitoring are provided.
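The battery-lifetime estimate for such duty-cycled sensor nodes usually reduces to a simple current budget; the sketch below shows that arithmetic with assumed, illustrative numbers (the report's actual measurements are not given in the abstract).

```python
# Rough battery-lifetime estimate for a duty-cycled wireless strain sensor node.
# All numbers are illustrative assumptions, not values from the report.
battery_capacity_mAh = 2600.0   # e.g. two AA lithium cells
active_current_mA = 25.0        # radio + ADC while sampling and transmitting
sleep_current_mA = 0.02         # deep-sleep current
duty_cycle = 0.01               # fraction of time the node is active

avg_current_mA = duty_cycle * active_current_mA + (1 - duty_cycle) * sleep_current_mA
lifetime_hours = battery_capacity_mAh / avg_current_mA
print(f"average current: {avg_current_mA:.3f} mA, lifetime: {lifetime_hours / 24:.0f} days")
```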
Abstract:
The goal of this work was to move structural health monitoring (SHM) one step closer to being ready for mainstream use by the Iowa Department of Transportation (DOT) Office of Bridges and Structures. To meet this goal, the objective of this project was to implement a pilot multi-sensor continuous monitoring system on the Iowa Falls Arch Bridge such that autonomous data analysis, storage, and retrieval can be demonstrated. The challenge with this work was to develop the open channels for communication, coordination, and cooperation of various Iowa DOT offices that could make use of the data. In a way, the end product was to be something akin to a control system that would allow for real-time evaluation of the operational condition of a monitored bridge. Development and finalization of general hardware and software components for a bridge SHM system were investigated and completed. This development and finalization was framed around the demonstration installation on the Iowa Falls Arch Bridge. The hardware system focused on using off-the-shelf sensors that could be read in either “fast” or “slow” modes depending on the desired monitoring metric. As hoped, the installed system operated with very few problems. In terms of communications—in part due to the anticipated installation on the I-74 bridge over the Mississippi River—a hardline digital subscriber line (DSL) internet connection and grid power were used. During operation, this system would transmit data to a central server location where the data would be processed and then archived for future retrieval and use. The pilot monitoring system was developed for general performance evaluation purposes (construction, structural, environmental, etc.) such that it could be easily adapted to the Iowa DOT’s bridges and other monitoring needs. The system was developed allowing easy access to near real-time data in a format usable to Iowa DOT engineers.
Abstract:
Following the success of the first round table in 2001, the Swiss Proteomic Society has organized two additional specific events during its last two meetings: a proteomic application exercise in 2002 and a round table in 2003. The main objective of such events is to bring together, around a challenging topic in mass spectrometry, two groups of specialists: those who develop and commercialize mass spectrometry equipment and software, and expert MS users for peptidomics and proteomics studies. The first round table (Geneva, 2001), entitled "Challenges in Mass Spectrometry", was supported by brief oral presentations that stressed critical questions in the field of MS development or applications (Stöcklin and Binz, Proteomics 2002, 2, 825-827). Topics addressed included (i) direct analysis of complex biological samples; (ii) status and perspectives for MS investigations of noncovalent peptide-ligand interactions; (iii) whether it is more appropriate to have complementary instruments rather than one universal piece of equipment; (iv) standardization and improvement of MS signals for protein identification; (v) what the new generation of equipment would be; and finally (vi) how to keep hardware and software adapted to MS up to date and accessible to all. For the SPS'02 meeting (Lausanne, 2002), a full-session alternative event, the "Proteomic Application Exercise", was proposed. Two different samples were prepared and sent to the participants: 100 µg of snake venom (a complex mixture of peptides and proteins) and 10-20 µg of an almost pure recombinant polypeptide derived from the shrimp Penaeus vannamei carrying a heterogeneous post-translational modification (PTM). Among the 15 participants that received the samples blind, eight returned results, and most of them were asked to present their results at the congress, emphasizing the strategy, the manpower and the instrumentation used (Binz et al., Proteomics 2003, 3, 1562-1566). It appeared that for the snake venom extract the quality of the results was not particularly dependent on the strategy used, as all approaches allowed identification of a certain number of protein families. The genus of the snake was identified in most cases, but the species was ambiguous. Surprisingly, the precise identification of the almost pure recombinant polypeptide turned out to be much more complicated than expected, as only one group reported the full sequence. Finally, the SPS'03 meeting reported here included a round table on the difficult and challenging task of "Quantification by Mass Spectrometry", a discussion sustained by four selected oral presentations on the use of stable isotopes, electrospray ionization versus matrix-assisted laser desorption/ionization approaches to quantify peptides and proteins in biological fluids, the handling of differential two-dimensional liquid chromatography tandem mass spectrometry data resulting from high-throughput experiments, and the quantitative analysis of PTMs. During these three events at the SPS meetings, the impressive quality and quantity of exchanges between the developers and providers of mass spectrometry equipment and software, expert users and the audience were a key element of the success of these fruitful events and have definitively paved the way for future round tables and challenging exercises at SPS meetings.
Abstract:
MRI has evolved into an important diagnostic technique in medical imaging. However, the reliability of the derived diagnosis can be degraded by artifacts, which challenge both radiologists and automatic computer-aided diagnosis. This work proposes a fully automatic method for measuring the image quality of three-dimensional (3D) structural MRI. Quality measures are derived by analyzing the air background of magnitude images and are capable of detecting image degradation from several sources, including bulk motion, residual magnetization from incomplete spoiling, blurring, and ghosting. The method has been validated on 749 3D T1-weighted 1.5T and 3T head scans acquired at 36 Alzheimer's Disease Neuroimaging Initiative (ADNI) study sites operating with various software and hardware combinations. Results are compared against qualitative grades assigned by the ADNI quality control center (taken as the reference standard). The derived quality indices are independent of the MRI system used and agree with the reference-standard quality ratings with high sensitivity and specificity (>85%). The proposed procedures for quality assessment could be of great value for both research and routine clinical imaging, as they could greatly improve workflow through their ability to rule out the need for a repeat scan while the patient is still in the magnet bore.
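The article's specific indices are not given in the abstract; the fragment below is only a minimal, assumed illustration of the general idea of deriving a quality measure from the air background of a magnitude image: estimate the background level from corner patches and relate it to the foreground signal. The function name, patch size and synthetic data are invented for the sketch and are not the ADNI or paper metric.

```python
import numpy as np

def background_quality_index(volume: np.ndarray, margin: int = 10) -> float:
    """Crude SNR-like index from the air background of a 3D magnitude image.

    Uses the eight corner patches (assumed to contain only air) to estimate the
    background level; artifacts such as ghosting or motion raise that level and
    lower the index. Illustrative only.
    """
    m = margin
    corners = [volume[s0, s1, s2]
               for s0 in (slice(0, m), slice(-m, None))
               for s1 in (slice(0, m), slice(-m, None))
               for s2 in (slice(0, m), slice(-m, None))]
    background = np.concatenate([c.ravel() for c in corners])
    signal = np.percentile(volume, 99)          # bright-tissue reference level
    return float(signal / (background.mean() + 1e-9))

# Example with synthetic data: a noisy volume with a bright central "head".
rng = np.random.default_rng(0)
vol = rng.rayleigh(5.0, size=(64, 64, 64))      # Rician-like air background
vol[16:48, 16:48, 16:48] += 200.0               # crude object
print(f"quality index: {background_quality_index(vol):.1f}")
```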
Abstract:
Introduction: Building online courses is a highly time-consuming task for the teachers of a single university. Universities working alone create high-quality courses but often cannot cover all pathological fields. Moreover, this often leads to duplication of content among universities, representing a great waste of teachers' time and energy. In 2011 we initiated a French university network for building shared online pathology teaching cases, and this network was extended in 2012 to Quebec and Switzerland. Method: Twenty French universities, Université Laval in Quebec and the University of Lausanne in Switzerland are associated with this project. One e-learning Moodle platform (http://moodle.sorbonne-paris-cite.fr/) contains texts with URLs pointing toward virtual slides that are hosted in a decentralized way at several universities. Each university is responsible for its own slide scanning, slide storage and online display with virtual slide viewers. The Moodle website is hosted by PRES Sorbonne Paris Cité, and financial support for hardware has been obtained from UNF3S (http://www.unf3s.org/) and from PRES Sorbonne Paris Cité. Financial support for international fellowships has been obtained from CFQCU (http://www.cfqcu.org/). Results: The Moodle interface was explained to pathology teachers using web-based conferences with screen sharing. The teachers then added content such as clinical cases, self-evaluations and other media, organized in several sections by student level and pathological field. The content can be used for online learning or for online preparation of subsequent classroom courses. In autumn 2013, one resident from Quebec spent 6 weeks in France and Switzerland and created original content in inflammatory skin pathology. This content is currently being validated by senior teachers and will be opened to pathology residents in spring 2014. All contents of the website can be accessed for free. Most contents just require anonymous connection, but some specific fields, especially those containing pictures obtained from patients who agreed to a teaching use only, require personal identification of the students. Also, students have to register to access the Moodle tests. All contents are written in French, but one case has been translated into English to illustrate this communication (http://moodle.sorbonne-pariscite.fr/mod/page/view.php?id=261) (use "login as a guest"). The Moodle test module allows many types of shared questions, making it easy to create personalized tests. Contents that are opened to students have been validated by an editorial committee composed of colleagues from the participating institutions. Conclusions: Future developments include other international fellowships, the next one being scheduled for one French resident from May to October 2014 in Quebec, with a study program centered on lung and breast pathology. It must be kept in mind that these e-learning programs depend highly on teachers' time, not only at these early steps but also later, to update the contents. We believe that funding resident fellowships for developing online pathology teaching contents is a win-win situation: highly beneficial for the resident, who will improve his or her knowledge and way of thinking; highly beneficial for the teachers, who will worry less about access rights or image formats; and finally highly beneficial for the students, who will get courses fully adapted to their practice.
Abstract:
Remote control systems are a very useful way to control and monitor devices quickly and easily. This paper proposes a new architecture for the remote control of Android mobile devices, analyzing the different alternatives and seeking the optimal solution in each case. Although the area of remote control has been little explored in the case of mobile devices, it may provide important advantages for testing software and hardware developments on several real devices. It can also allow efficient management of various devices of different types, performing forensic security tasks, etc. The main idea behind the proposed architecture was the design of a system to be used as a platform that provides the services needed to perform remote control of mobile devices. As a result of this research, a proof of concept was implemented: an Android application running a group of server programs on the device, connected through the network or the USB interface, depending on availability. These servers can be controlled through a small client written in Java that can run on both desktop and web systems.
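As a purely illustrative sketch of the client-server interaction described (the real system's protocol, ports and commands are not specified in the abstract, so the address, port and command below are assumptions), a minimal TCP client could send a text command to one of the server programs running on the device:

```python
import socket

DEVICE_ADDR = ("192.168.0.42", 5555)   # assumed address/port of a server on the device

def send_command(command: str) -> str:
    """Send one text command to the on-device server and return its reply (assumed protocol)."""
    with socket.create_connection(DEVICE_ADDR, timeout=5) as sock:
        sock.sendall(command.encode("utf-8") + b"\n")
        return sock.recv(4096).decode("utf-8")

if __name__ == "__main__":
    print(send_command("screenshot"))   # hypothetical command name
```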