Abstract:
The development of a distributed information measurement and control system for optical spectral research on particle-beam and plasma objects, and for laboratory coursework at the Physics and Engineering Department of Petrozavodsk State University, is described. At the hardware level the system is a complex of automated workplaces joined into a computer network. The key element of the system is the communication server, which supports multi-user operation, distributes resources among clients, monitors the system, and provides secure access. The other system components are the equipment servers (CAMAC and GPIB servers, a server for access to MCS-196 microcontrollers, and others) and the client programs that carry out data acquisition, accumulation, and processing, as well as management of the course of the experiment. This work discusses the network interface designed by the authors, which connects measuring and executive devices to the distributed system via Ethernet. The interface allows experimental parameters to be controlled through digital devices and monitored by polling analog and digital sensors. The device firmware is written in assembly language and includes libraries for forming Ethernet, IP, TCP, and UDP packets.
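The abstract describes firmware that forms raw Ethernet/IP/TCP/UDP packets and polls analog and digital sensors over the network. As a rough illustration of the polling idea only (the real wire format is not given in the abstract; the command byte, frame layout, and field widths below are assumptions), a request/reply frame could be packed and parsed like this:

```python
import struct

POLL = 0x01  # hypothetical command byte for "read one sensor channel"

def build_poll_request(channel: int) -> bytes:
    """Pack a poll request: command byte, channel number, and a 16-bit
    value field (zeroed in requests). Big-endian, as is usual on the wire."""
    return struct.pack(">BBH", POLL, channel, 0)

def parse_poll_reply(frame: bytes):
    """Unpack a reply frame and return (channel, raw sensor value)."""
    cmd, channel, value = struct.unpack(">BBH", frame)
    if cmd != POLL:
        raise ValueError("unexpected command byte: %#x" % cmd)
    return channel, value
```

A client would send `build_poll_request(ch)` to the interface over a UDP socket and hand the datagram it gets back to `parse_poll_reply`.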
Abstract:
This paper is devoted to the creation of cryptographic data security and the realization of a packet mode in the distributed information measurement and control system that implements methods of optical spectroscopy for research in plasma physics and atomic collisions. The system provides remote access to information and hardware resources for the natural sciences within Intranet/Internet networks. Access to the physical equipment is realized through the standard interface servers (PXI, CAMAC, and GPIB), a server providing access to Ethernet devices, and the communication server, which integrates the equipment servers into a uniform information system. The system is used both to carry out research in optical spectroscopy and to support the process of education at the Department of Physics and Engineering of Petrozavodsk State University.
Abstract:
Five-axis machine tools are becoming more popular as customers demand more complex machined parts. In high-value manufacturing, machine tools are essential to producing high-accuracy products. High-accuracy manufacturing requires producing parts repeatably and in compliance with the defined design specifications. The performance of machine tools is often affected by geometric errors from a variety of causes, including incorrect tool offsets, errors in the centres of rotation, and thermal growth. As a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to verify machine tools in terms of their geometric and positioning accuracy. Once a machine tool's accuracy has been verified, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no post-process measurement. This paper lists the benefits of machine tool verification and uses a case study to demonstrate the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
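In a ballbar test, the machine traces a nominal circle while a telescoping ballbar records the instantaneous radius; deviations of the recorded radius from nominal reveal geometric errors. A minimal sketch of the basic data reduction (the peak-to-peak circularity measure below is the standard idea behind such tests, not this paper's specific diagnostics):

```python
def circularity_error(radii, nominal_radius):
    """Peak-to-peak radial deviation over one traced circle:
    the difference between the largest and smallest deviations
    of the recorded radius from the nominal radius."""
    deviations = [r - nominal_radius for r in radii]
    return max(deviations) - min(deviations)
```

In practice the deviation trace is further decomposed into signatures (backlash steps, squareness ovality, servo mismatch lobes) to diagnose individual error sources.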
Abstract:
2000 Mathematics Subject Classification: Primary 47A48, 93B28, 47A65; Secondary 34C94.
Abstract:
Firms worldwide are taking major initiatives to reduce the carbon footprint of their supply chains in response to growing governmental and consumer pressures. In real life, these supply chains face stochastic and non-stationary demand, but most studies on the inventory lot-sizing problem with emission concerns consider deterministic demand. In this paper, we study the inventory lot-sizing problem under non-stationary stochastic demand with emission and cycle-service-level constraints, considering a carbon cap-and-trade regulatory mechanism. Using a mixed integer linear programming model, the paper investigates the effects of emission parameters and of product- and system-related features on supply chain performance through extensive computational experiments covering general business settings rather than a specific scenario. Results show that cycle service level and demand coefficient of variation have significant impacts on total cost and emission irrespective of the level of demand variability, while the impact of the product's demand pattern is significant only at lower levels of demand variability. Results also show that an increasing carbon price reduces total cost, total emission, and total inventory, and that the scope for emission reduction through a higher carbon price is greater at higher levels of cycle service level and demand coefficient of variation. The analysis helps supply chain managers make the right decisions in different demand and service-level situations.
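Under a cap-and-trade mechanism of the kind the model considers, the emission term enters the cost objective as a linear trading term: allowances are bought at the carbon price when total emission exceeds the cap and surplus allowances are sold when it falls below. A minimal sketch of that accounting (an illustration of the mechanism, not the paper's MILP; the function name and arguments are hypothetical):

```python
def cap_and_trade_cost(operational_cost, emission, cap, carbon_price):
    """Total cost under cap-and-trade: operational cost plus the net
    allowance position. (emission - cap) is positive when allowances
    must be bought and negative when surplus allowances are sold,
    so selling enters as revenue."""
    return operational_cost + carbon_price * (emission - cap)
```

For example, with a cap of 40 units and a carbon price of 2, emitting 50 units adds 20 to cost while emitting 30 units earns 20 back, which is one way a higher carbon price can reduce total cost for a low-emission plan.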
Abstract:
In view of the increasing complexity of service logic and functional requirements, a new SOA-based system architecture was proposed for the equipment remote monitoring and diagnosis system. Following the design principles of SOA, service logic and functional requirements for the remote monitoring and diagnosis system were divided into different levels and granularities, and a loosely coupled web services system was built. The design and implementation scheme of the core function modules of the proposed architecture is presented. A demo system was used to validate the feasibility of the proposed architecture.
Abstract:
The authors study a family of tax evasion models in which a flat-rate tax finances only the provision of public goods; audits and wage differences are neglected. The paper focuses on comparing two modelling approaches. In the first, every worker earns the same income and each year declares the amount that maximizes the sum of the utility of consumption financed from retained income and a moral utility of declaration; the latter is the product of three factors: the worker's exogenous tax morale, the average income declaration observed in the neighbourhood in the previous year, and the endogenous utility of the worker's own declaration. In the second approach, agents act according to simple heuristic rules. While the optimizing model yields traditionally shaped Laffer curves, the heuristics-based models exhibit (linearly) increasing Laffer curves. This difference is related to a peculiar type of behaviour: in the agent-based approach, a number of agents end up in a moral state of limbo, alternating between altruism and selfishness.
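A minimal sketch of the heuristic variant (the threshold rule and parameter names below are assumptions for illustration, not the paper's exact specification): each agent declares its full income when its exogenous tax morale, weighted by the previous period's average declaration, outweighs the tax rate, and declares nothing otherwise. Agents whose weighted morale sits near the threshold can flip between the two states as the average moves, which is the "limbo" behaviour described above.

```python
def step(declarations, morale, tax_rate):
    """One heuristic update: agent i declares fully when
    morale[i] * (last period's average declaration) >= tax_rate,
    and evades completely otherwise."""
    avg = sum(declarations) / len(declarations)
    return [1.0 if m * avg >= tax_rate else 0.0 for m in morale]

def tax_revenue(morale, tax_rate, periods=50):
    """Iterate the heuristic from full compliance and return the
    flat-tax revenue collected in the final period."""
    decl = [1.0] * len(morale)
    for _ in range(periods):
        decl = step(decl, morale, tax_rate)
    return tax_rate * sum(decl)
```

Sweeping `tax_rate` and plotting `tax_revenue` against it would trace out the Laffer curve for a given morale distribution.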
Abstract:
The commercialization of inventions is complex and challenging; it therefore requires the collaboration of several actors in an economy. Even when an invention possesses significant added value, it can be successfully commercialized only in a stable macroeconomic and innovation environment, and only if proper innovation management expertise is available. ValDeal Innovations Zrt. was established to foster the commercialization of Hungarian inventions with high business potential by providing such business expertise. The company used a US innovation management method, already proven in various markets and countries, covering both technology evaluation and the commercialization of inventions. Major changes to the US method proved necessary, owing to the different macroeconomic circumstances and attitudes toward innovation in Hungary. The article details the issues mentioned above, together with the conclusions the members of ValDeal have drawn during the innovation management process.
Abstract:
This study investigated the effects of augmented prenatal auditory stimulation on postnatal visual responsivity and neural organization in bobwhite quail (Colinus virginianus). I delivered conspecific embryonic vocalizations before, during, or after the development of a multisensory midbrain audiovisual area, the optic tectum. Postnatal simultaneous choice tests revealed that hatchlings receiving augmented auditory stimulation as embryos during optic tectum development failed to show species-typical visual preferences for a conspecific maternal hen 72 hours after hatching. Auditory simultaneous choice tests showed no deficits in auditory function in any of the groups, indicating the deficits were specific to visual function. ZENK protein expression confirmed differences in the amount of neural plasticity in multiple neuroanatomical regions of birds receiving stimulation during optic tectum development, compared to unmanipulated birds. The results of these experiments support the notion that the timing of augmented prenatal auditory stimulation relative to optic tectum development can have an enduring impact on postnatal perceptual organization.
Abstract:
The purpose of this investigation was to develop and implement a general-purpose VLSI (Very Large Scale Integration) test module based on an FPGA (Field Programmable Gate Array) system to verify the mechanical behavior and performance of MEM sensors, with associated corrective capabilities, and to make use of the evolving System-C, a new open-source HDL (Hardware Description Language), for the design of the FPGA functional units. System-C is becoming widely accepted as a platform for modeling, simulating and implementing systems consisting of both hardware and software components. In this investigation, a dual-axis accelerometer (ADXL202E) and a temperature sensor (TMP03) were used for the test module verification. Results of the test module measurements were analyzed for repeatability and reliability, and then compared to the sensor datasheets. Ideas for further study were identified based on the results analysis. ASIC (Application Specific Integrated Circuit) design concepts were also pursued.
Abstract:
This study demonstrates the compositional heterogeneity of a protein-like fluorescence emission signal (T-peak; excitation/emission maximum at 280/325 nm) of dissolved organic matter (DOM) samples collected from subtropical river and estuarine environments. Natural water samples were collected from the Florida Coastal Everglades ecosystem. The samples were ultrafiltered and excitation–emission fluorescence matrices were obtained. The T-peak intensity correlated positively with the N concentration of the ultrafiltered DOM solution (UDON), although the low correlation coefficient (r2=0.140, p<0.05) suggested the coexistence of proteins with other classes of compounds in the T-peak. The T-peak was therefore unbundled by size exclusion chromatography. The elution curves showed that the T-peak was composed of two compounds with distinct nominal molecular weights (MW) of about >5×10⁴ (T1) and ∼7.6×10³ (T2), with varying relative abundance among samples. The T1-peak intensity correlated strongly with [UDON] (r2=0.516, p<0.001), while the T2-peak did not, suggesting that the T-peak is composed of a mixture of compounds with different chemical structures and ecological roles, namely proteinaceous materials and presumably phenolic moieties in humic-like substances. Natural sources of the latter may include polyphenols leached from senescent plant materials, which are important precursors of humic substances. This idea is supported by the fact that polyphenols such as gallic acid, an important constituent of hydrolysable tannins, and condensed tannins extracted from red mangrove (Rhizophora mangle) leaves exhibit fluorescence peaks in the close vicinity of the T-peak (260/346 and 275/313 nm, respectively). Based on this study, the application of the T-peak as a proxy for [DON] in natural waters may have limitations in coastal zones with significant terrestrial DOM input.
Abstract:
The assessment of organic matter (OM) sources in sediments and soils is key to better understanding the biogeochemical cycling of carbon in aquatic environments. While traditional molecular-marker-based methods have provided such information for typical two-end-member (allochthonous/terrestrial vs. autochthonous/microbial) systems, more detailed, biomass-specific assessments are needed for ecosystems with complex OM inputs, such as tropical and sub-tropical wetlands and estuaries, where aquatic macrophytes and macroalgae may play an important role as OM sources. The aim of this study was to assess the utility of a combined approach using compound-specific stable carbon isotope analysis and an n-alkane-based proxy (Paq) to differentiate submerged and emergent/terrestrial vegetation OM inputs to soils/sediments from a sub-tropical wetland and estuarine system, the Florida Coastal Everglades. Results show that Paq values (0.13–0.51) for the emergent/terrestrial plants were generally lower than those for freshwater/marine submerged vegetation (0.45–1.00), and that compound-specific δ13C values for the n-alkanes (C23 to C31) were distinctively different for terrestrial/emergent and freshwater/marine submerged plants. While crossplots of the Paq and stable isotope values for the C23 n-alkane suggest that OM inputs are controlled by vegetation changes along the freshwater-to-marine transect, further resolution regarding OM input changes along this landscape was obtained through principal component analysis (PCA), which successfully grouped the study sites according to OM source strengths. The data show the potential of this n-alkane-based multi-proxy approach as a means of assessing OM inputs to complex ecosystems.
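The Paq proxy is commonly defined in the literature (Ficken et al., 2000) as the abundance of mid-chain n-alkanes, produced mainly by submerged and floating macrophytes, relative to mid- plus long-chain homologues, the latter typical of emergent and terrestrial plants. A minimal sketch of the computation (the formula is the standard literature definition, not quoted from this abstract):

```python
def paq(c23, c25, c29, c31):
    """P_aq = (C23 + C25) / (C23 + C25 + C29 + C31), computed from the
    relative abundances of the odd-carbon n-alkane homologues.
    Values near 1 indicate submerged/floating macrophyte input;
    values near 0 indicate emergent/terrestrial input."""
    return (c23 + c25) / (c23 + c25 + c29 + c31)
```

With equal abundances of all four homologues, `paq(1, 1, 1, 1)` gives 0.5, which falls between the emergent/terrestrial (0.13–0.51) and submerged (0.45–1.00) ranges reported above.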
Abstract:
The author explores the challenges graduate students face in preparing for a dissertation through university events. An autoethnography using notes, observation, and journal writing, framed by genre and activity-system theory, highlights conflicts between the university system and the culture of the student.
Abstract:
In his dialogue - Near Term Computer Management Strategy For Hospitality Managers and Computer System Vendors - by William O'Brien, Associate Professor, School of Hospitality Management at Florida International University, Associate Professor O'Brien initially states: "The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools." At the time of this writing, Associate Professor O'Brien will have you know, contrary to what some people might think, that the computer revolution is not over; it is just beginning, just an embryo. Computer technology will only continue to develop and expand, says O'Brien with citation. "A complacent few of us who feel 'we have survived the computer revolution' will miss opportunities as a new wave of technology moves through the hospitality industry," says Professor O'Brien. "Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation," is his informed opinion. Property managers who embrace rather than eschew innovation, in this case computer technology, will benefit greatly from this new science in hospitality management, O'Brien says. "The manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology," he advises. On the vendor side of the equation, O'Brien observes, "Computer-wise hospitality managers want systems which are easier and more profitable to operate.
Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…" he says. O'Brien warns against taking a gamble on a risky computer system by falling victim to unsubstantiated claims and pie-in-the-sky promises. He recommends affiliating with turn-key vendors who provide hardware, software, and training, or soliciting the help of large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has in effect morphed into the software revolution, informs O'Brien; "…recognizing that a computer is nothing but a box in which programs run." Yes, some of the empirical data in this article is dated by now, but the core philosophy of advancing technology, and of properties continually tapping current knowledge, is sound.