951 results for Distributed space-time code
Abstract:
The Hf isotope composition of seawater does not match that expected from dissolution of bulk continental crust. This mismatch is generally considered to be due to retention of unradiogenic Hf in resistant zircons during incomplete weathering of continental crust. During periods of intense glacial weathering, zircons should break down more efficiently, resulting in the release of highly unradiogenic Hf to the oceans. We test this hypothesis by comparing Nd and Hf isotope time series obtained from NW Atlantic ferromanganese crusts. Both isotope systems show a decrease associated with the onset of northern hemisphere glaciation. The observed changes display distinct trajectories in epsilon Nd-epsilon Hf space, which differ from previously reported arrays of bulk terrestrial material and seawater. Such patterns are consistent with the release of highly unradiogenic Hf from very old zircons, facilitated by enhanced mechanical weathering.
Abstract:
This study examines how well two geomagnetic index series, and series synthesized from a semi-empirical model of magnetospheric currents, explain the geomagnetic activity observed at Northern Hemisphere mid-latitude ground-based stations. We analyse data for the 2007 to 2014 period from four magnetic observatories (Coimbra, Portugal; Panagyurishte, Bulgaria; Novosibirsk, Russia; and Boulder, USA) at geomagnetic latitudes between 40° and 50° N. The quiet daily (QD) variation is first removed from the time series of the geomagnetic horizontal component (H) using natural orthogonal components (NOC) tools. We compare the resulting series with series of the storm-time disturbance (Dst) and ring current (RC) indices and with H series synthesized from the Tsyganenko and Sitnov (2005, doi:10.1029/2004JA010798) (TS05) semi-empirical model of the storm-time geomagnetic field. In the analysis, we separate days with low and high local K-index values. Our results show that NOC models are as efficient as standard models of QD variation in preparing raw data for comparison with proxies, but with much less complexity. For the two stations in Europe, we obtain indications that NOC models may be able to separate ionospheric and magnetospheric contributions. The Dst and RC series explain the four observatory H series well, with mean significant correlation coefficients from 0.5 to 0.6 during low geomagnetic activity (K less than 4) and from 0.6 to 0.7 on geomagnetically active days (K greater than or equal to 4).
With regard to the performance of TS05, our results show that the four observatories separate into two groups: Coimbra and Panagyurishte, in one group, for which the magnetospheric-to-ionospheric ratio in the QD variation is smaller, a dominantly ionospheric QD contribution can be removed, and TS05 simulations are the best proxy; and Boulder and Novosibirsk, in the other group, for which the ionospheric and magnetospheric contributions to the QD variation cannot be differentiated and correlations with TS05 series cannot be improved. The main contributors to the magnetospheric QD signal are the Birkeland currents. The relatively good success of the TS05 model in explaining ground-based irregular geomagnetic activity at mid-latitudes makes it an effective tool for classifying storms according to their main sources. For Coimbra and Panagyurishte in particular, where the ionospheric and magnetospheric daily contributions seem easier to separate, we can aspire to use the TS05 model for ensemble generation in space weather (SW) forecasting and for the interpretation of past SW events.
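The NOC approach above is, in essence, a principal-component decomposition of a day-by-hour matrix of the H component. A minimal Python sketch of removing the QD variation this way, assuming a gap-free hourly series and a single leading mode (the study's station handling, gap filling, and mode selection are more elaborate):

```python
import numpy as np

def remove_qd_variation(h, samples_per_day=24, n_modes=1):
    """Remove the quiet-daily (QD) variation from a geomagnetic H series
    via natural orthogonal components (PCA on a day-by-hour matrix).
    Illustrative sketch only."""
    days = h.reshape(-1, samples_per_day)          # one row per day
    mean_day = days.mean(axis=0)
    anomalies = days - mean_day
    # Eigen-decomposition of the hour-by-hour covariance matrix
    cov = anomalies.T @ anomalies / len(days)
    eigvals, eigvecs = np.linalg.eigh(cov)
    modes = eigvecs[:, ::-1][:, :n_modes]          # leading NOC modes
    # Project each day onto the leading modes and rebuild the QD signal
    qd = anomalies @ modes @ modes.T + mean_day
    return (days - qd).ravel()                     # residual disturbance series

# Synthetic series: a fixed daily cycle plus random disturbance (nT)
rng = np.random.default_rng(0)
t = np.arange(30 * 24)
h = 20 * np.sin(2 * np.pi * (t % 24) / 24) + rng.normal(0, 1.0, t.size)
residual = remove_qd_variation(h)
print(round(float(np.std(residual)), 2))
```

The residual series is what would then be compared against the Dst, RC, or TS05 proxies.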
Abstract:
Fitting statistical models is computationally challenging when the sample size or the dimension of the dataset is huge. An attractive approach for down-scaling the problem size is to first partition the dataset into subsets and then fit using distributed algorithms. The dataset can be partitioned either horizontally (in the sample space) or vertically (in the feature space), and the challenge arises in defining an algorithm with low communication, theoretical guarantees, and excellent practical performance in general settings. For sample space partitioning, I propose a MEdian Selection Subset AGgregation Estimator ({\em message}) algorithm for solving these issues. The algorithm applies feature selection in parallel to each subset using regularized regression or Bayesian variable selection methods, calculates the `median' feature inclusion index, estimates coefficients for the selected features in parallel on each subset, and then averages these estimates. The algorithm is simple, involves minimal communication, scales efficiently in sample size, and has theoretical guarantees. I provide extensive experiments showing excellent performance in feature selection, estimation, prediction, and computation time relative to the usual competitors.
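The four steps of {\em message} can be sketched as follows. The per-subset selection step below uses a simple marginal-correlation screen as a stand-in for the regularized regression or Bayesian variable selection methods of the thesis; the other three steps follow the description above:

```python
import numpy as np

def message(X, y, n_subsets=4, k=5):
    """Sketch of the MEdian Selection Subset AGgregation Estimator."""
    subsets = np.array_split(np.arange(len(y)), n_subsets)
    inclusion = np.zeros((n_subsets, X.shape[1]))
    for i, idx in enumerate(subsets):
        # Step 1: feature selection (here a marginal screen), one subset each
        score = np.abs(X[idx].T @ y[idx])
        inclusion[i, np.argsort(score)[-k:]] = 1
    # Step 2: 'median' feature inclusion index across subsets
    selected = np.where(np.median(inclusion, axis=0) >= 0.5)[0]
    # Step 3: least-squares coefficients for the selected features, per subset
    betas = [np.linalg.lstsq(X[idx][:, selected], y[idx], rcond=None)[0]
             for idx in subsets]
    # Step 4: average the subset estimates
    beta = np.zeros(X.shape[1])
    beta[selected] = np.mean(betas, axis=0)
    return beta

rng = np.random.default_rng(1)
X = rng.normal(size=(800, 50))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0, 0.5, 800)
beta = message(X, y)
print(beta[:2])   # close to the true coefficients (3, -2)
```

Only the binary inclusion vectors and the short coefficient vectors cross worker boundaries, which is why the communication cost stays minimal.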
While sample space partitioning is useful in handling datasets with large sample size, feature space partitioning is more effective when the data dimension is high. Existing methods for partitioning features, however, are either vulnerable to high correlations or inefficient in reducing the model dimension. In the thesis, I propose a new embarrassingly parallel framework named {\em DECO} for distributed variable selection and parameter estimation. In {\em DECO}, variables are first partitioned and allocated to m distributed workers. The decorrelated subset data within each worker are then fitted via any algorithm designed for high-dimensional problems. We show that by incorporating the decorrelation step, DECO can achieve consistent variable selection and parameter estimation on each subset with (almost) no assumptions. In addition, the convergence rate is nearly minimax optimal for both sparse and weakly sparse models and does NOT depend on the partition number m. Extensive numerical experiments are provided to illustrate the performance of the new framework.
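The decorrelation step at the heart of {\em DECO} can be illustrated as below. This minimal sketch omits the ridge regularization and the refinement stage, and simply shows that pre-multiplying the data by (XX^T/p)^{-1/2} shrinks the correlations between columns, so workers holding different feature blocks can select variables almost independently:

```python
import numpy as np

def deco_decorrelate(X, y):
    """DECO decorrelation step: symmetric inverse square root of X X^T / p
    applied to both X and y. Minimal sketch only."""
    n, p = X.shape
    G = X @ X.T / p
    w, V = np.linalg.eigh(G)                       # G is symmetric PSD
    G_inv_sqrt = V @ np.diag(1.0 / np.sqrt(np.maximum(w, 1e-12))) @ V.T
    return G_inv_sqrt @ X, G_inv_sqrt @ y

def max_offdiag_corr(A):
    c = np.corrcoef(A, rowvar=False)
    return float(np.abs(c - np.diag(np.diag(c))).max())

# High-dimensional design (p > n) with a shared latent factor: columns are
# strongly correlated before decorrelation, far less so afterwards.
rng = np.random.default_rng(2)
z = rng.normal(size=(50, 1))
X = 0.9 * z + 0.4 * rng.normal(size=(50, 200))
y = X[:, 0] + rng.normal(0, 0.1, 50)
Xd, yd = deco_decorrelate(X, y)
print(max_offdiag_corr(X), max_offdiag_corr(Xd))
```

After this step the (Xd, yd) columns can be split across the m workers and fitted with any high-dimensional method.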
For datasets with both large sample sizes and high dimensionality, I propose a new "divide-and-conquer" framework, {\em DEME} (DECO-message), by leveraging both the {\em DECO} and the {\em message} algorithms. The new framework first partitions the dataset in the sample space into row cubes using {\em message} and then partitions the feature space of the cubes using {\em DECO}. This procedure is equivalent to partitioning the original data matrix into multiple small blocks, each with a feasible size that can be stored and fitted on a single machine in parallel. The results are then synthesized via the {\em DECO} and {\em message} algorithms in reverse order to produce the final output. The whole framework is extremely scalable.
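The block layout produced by this row-then-column partition can be sketched as follows (the fitting and the reverse-order synthesis are omitted):

```python
import numpy as np

def block_partition(X, n_row_cubes, n_col_blocks):
    """DEME-style layout: rows are split into 'cubes' (the message step)
    and each cube's features into blocks (the DECO step), yielding small
    blocks that fit on one worker each."""
    row_cubes = np.array_split(np.arange(X.shape[0]), n_row_cubes)
    col_blocks = np.array_split(np.arange(X.shape[1]), n_col_blocks)
    return [[X[np.ix_(r, c)] for c in col_blocks] for r in row_cubes]

X = np.arange(24).reshape(6, 4)
blocks = block_partition(X, 3, 2)
print(len(blocks), len(blocks[0]), blocks[0][0].shape)   # → 3 2 (2, 2)
```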
Abstract:
In Model-Driven Engineering (MDE), the developer creates a model using a language such as the Unified Modeling Language (UML) or UML for Real-Time (UML-RT) and uses tools such as Papyrus or Papyrus-RT to generate code from that model. Tracing gives developers insight into their application as it runs, such as which events occur and their timing. We add monitoring capabilities, using the Linux Trace Toolkit: next generation (LTTng), to models created in UML-RT with Papyrus-RT. The implementation requires changing the code generator so that tracing statements for the events the user wants to monitor are added to the generated code. We also change the makefile to automate the build process, and we create an Extensible Markup Language (XML) file that allows developers to view their traces visually in Trace Compass, an Eclipse-based trace viewing tool. Finally, we validate our results using three models that we create and trace.
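The code-generator change can be pictured as a pass that injects tracepoint statements into the generated handlers. The sketch below is hypothetical: the provider name, event naming, and handler pattern are illustrative stand-ins, and real Papyrus-RT output is far more complex:

```python
import re

def inject_tracepoints(generated_code, events):
    """Toy version of a code-generator pass: insert an LTTng tracepoint
    statement at the start of each generated handler the user asked to
    monitor. Names and patterns here are hypothetical."""
    def add_trace(match):
        func = match.group(1)
        if func in events:
            return (match.group(0) +
                    f'\n    tracepoint(model_provider, {func}_entry);')
        return match.group(0)
    # Match generated C++ handler definitions like "void onTimeout() {"
    return re.sub(r'void (\w+)\(\)\s*\{', add_trace, generated_code)

code = "void onTimeout() {\n    doWork();\n}\n"
print(inject_tracepoints(code, {"onTimeout"}))
```

Handlers not listed in the monitored-event set pass through unchanged, so tracing overhead is only paid where the user asked for it.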
Abstract:
This paper presents a vision that allows the combined use of model-driven engineering, run-time monitoring, and animation for the development and analysis of components in real-time embedded systems. A key building block in the tool environment supporting this vision is a highly customizable code generation process. Customization is performed via a configuration specification which describes the ways in which input is provided to the component, the ways in which run-time execution information can be observed, and how these observations drive animation tools. The environment is envisioned to be suitable for activities ranging from quality assurance to supporting certification, teaching, and outreach, and will be built exclusively with open source tools to increase impact. A preliminary prototype implementation is described.
Abstract:
This paper discusses the importance of space in today’s space-driven world, reviews the current space activities of Turkey and its space organizations along with background information on legislation, and argues for the establishment of the Turkish Space Agency (TSA). Firstly, the importance of space is outlined, followed by a brief background and the current space activities in Turkey. Then, the reasons why Turkey needs a national space agency are set out by stating its expected role and duties. Additionally, a framework for a space policy for Turkey is proposed and the findings are compared with those of other developing regional space actors. Lastly, it is argued that Turkey is on the right track with its space policy and that the establishment of the TSA is critical both for a coherent space policy and progress, and for the successful development of its national space industry, security, and international space relations.
Abstract:
Over the last decade, ocean sunfish movements have been monitored worldwide using various satellite tracking methods. This study reports the near-real-time monitoring of the fine-scale (< 10 m) behaviour of sunfish. The study was conducted in southern Portugal in May 2014 and involved satellite tags and underwater and surface robotic vehicles to measure both the movements and the contextual environment of the fish. A total of four individuals were tracked using custom-made GPS satellite tags providing geolocation estimates of fine-scale resolution. These accurate positions further informed sunfish areas of restricted search (ARS), which were directly correlated to steep thermal frontal zones. Simultaneously, and on two different occasions, an Autonomous Underwater Vehicle (AUV) video-recorded the path of the tracked fish and detected buoyant particles in the water column. Importantly, the densities of these particles were also directly correlated to steep thermal gradients. Thus, both sunfish foraging behaviour (ARS) and possibly prey densities were found to be influenced by analogous environmental conditions. In addition, the dynamic structure of the water transited by the tracked individuals was described by a Lagrangian modelling approach. The model informed the distribution of zooplankton in the region, both horizontally and in the water column, and the resultant simulated densities positively correlated with the sunfish ARS behaviour estimator (rs = 0.184, p < 0.001). The model also revealed that tracked fish opportunistically displace with respect to subsurface current flow. Thus, we show how physical forcing and current structure provide a rationale for a predator’s fine-scale behaviour observed over two weeks in May 2014.
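The rs statistic quoted above is Spearman's rank correlation, which can be computed as follows (minimal version assuming no tied ranks):

```python
def spearman_rs(x, y):
    """Spearman rank correlation: 1 - 6 * sum(d^2) / (n * (n^2 - 1)),
    where d is the per-observation difference in ranks."""
    n = len(x)
    def ranks(v):
        order = sorted(range(n), key=lambda i: v[i])
        r = [0] * n
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

print(spearman_rs([1, 2, 3, 4, 5], [2, 1, 4, 3, 5]))  # → 0.8
```

Because it works on ranks, the statistic is robust to the skewed density values a plankton model typically produces, which is presumably why a rank correlation was used rather than Pearson's r.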
Abstract:
Unstructured mesh based codes for the modelling of continuum physics phenomena have evolved to provide the facility to model complex interacting systems. Such codes have the potential to provide high performance on parallel platforms for a small investment in programming. The critical parameters for success are to minimise changes to the code to ease maintenance while providing high parallel efficiency, scalability to large numbers of processors, and portability to a wide range of platforms. The paradigm of domain decomposition with message passing has for some time been demonstrated to provide a high level of efficiency, scalability, and portability across shared and distributed memory systems without the need to re-author the code in a new language. This paper addresses these issues in the parallelisation of a complex three-dimensional unstructured mesh Finite Volume multiphysics code and discusses the implications of automating the parallelisation process.
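The domain-decomposition-with-message-passing paradigm can be illustrated with a one-dimensional Jacobi smoother: the mesh is split into subdomains with one-cell halos, and each step exchanges halo values between neighbours before the purely local update. Message passing is simulated with direct copies here, standing in for the MPI-style exchanges a real code would use:

```python
import numpy as np

def jacobi_dd(u, n_domains, steps):
    """Jacobi smoothing parallelised by 1-D domain decomposition.
    The result is identical to the serial update; only the data layout
    and the halo exchange reflect the parallel structure."""
    parts = np.array_split(u, n_domains)
    for _ in range(steps):
        # 'Message passing': exchange boundary cells into halos
        halos = []
        for i, p in enumerate(parts):
            left = parts[i - 1][-1] if i > 0 else p[0]
            right = parts[i + 1][0] if i < n_domains - 1 else p[-1]
            halos.append((left, right))
        # Local Jacobi update on each subdomain using its halo
        new_parts = []
        for (left, right), p in zip(halos, parts):
            ext = np.concatenate(([left], p, [right]))
            new_parts.append(0.5 * (ext[:-2] + ext[2:]))
        parts = new_parts
    return np.concatenate(parts)

u = np.zeros(16)
u[8] = 1.0
print(jacobi_dd(u, 4, 1).sum())   # the update conserves the total
```

Because only the halo cells cross subdomain boundaries, communication volume scales with the surface of each subdomain while the computation scales with its volume, which is what makes the paradigm efficient and scalable.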
Abstract:
This paper discusses the urban consumer culture in Moscow and Petersburg during the 1880s and 1890s and uses the consumption of bicycles and watches as a lens through which to explore changing perceptions of time and space within the experience of modernity at the end of the nineteenth century. Specifically, I argue that the way in which consumers and merchants constructed a dialogue of meaning around particular objects, that is, the way in which objects are consumed by a culture, gives insight into the values, morals, and tenor of that culture. The paper privileges newspaper ads and photographs as the mouthpieces of merchants and consumers, respectively, as they constructed a dialogue in the language of consumerism, and explores the ways in which both parties sought to assign meaning to objects during the experience of modernity. I am particularly interested in the way consumers perform elements of cultural modernity in photographs and how these instances of performance relate to their negotiation of modernity. The paper takes as its focus a large section of the urban Russian population, much of which can traditionally be called “middle class” but whose diversity has led me to adopt the term “consumer community,” and whose makeup is described in detail. The paper contributes to the continuing scholarly discourse on the makeup of the middle class in Russia and the social boundaries of late tsarist society. It speaks to the developing sensibilities and values of a generation struggling to define itself in a rapidly changing world, to the ways in which conceptualizations of public and private space, as well as feminine and masculine space, were redefined, and to the developing visual culture of the Russian consumer society, largely predicated on the display of objects to signify socially desirable traits.
Whereas other explorations of consumer culture and advertisements have portrayed the relationship between merchants and consumers as a one-sided monologue in which merchants convince consumers that certain objects have cultural value, I emphasize the dialogue between merchants and consumers and their mutual negotiation of cultural meaning through objects.
Abstract:
The purpose of this paper is twofold. Firstly, it presents a preliminary and ethnomethodologically-informed analysis of the way in which the growing structure of a particular program's code was ongoingly derived from its earliest stages. This was motivated by an interest in how the detailed structure of a completed program `emerged from nothing' as a product of the concrete practices of the programmer within the framework afforded by the language. The analysis is broken down into three sections that discuss: the beginnings of the program's structure; the incremental development of structure; and finally the code productions that constitute the structure and the importance of the programmer's stock of knowledge. The discussion attempts to understand and describe the emerging structure of code rather than focus on generating `requirements' for supporting the production of that structure. Due to time and space constraints, however, only a relatively cursory examination of these features was possible. Secondly, the paper presents some thoughts on the difficulties associated with the analytic, in particular ethnographic, study of code, drawing on general problems as well as issues arising from the difficulties and failings encountered in the analysis presented in the first section.
Abstract:
Restoration of natural wetlands may be informed by macroinvertebrate community composition. Macroinvertebrate communities of wetlands are influenced by environmental characteristics such as vegetation, soil, hydrology, land use, and isolation. This dissertation explores multiple approaches to the assessment of wetland macroinvertebrate community composition and demonstrates how these approaches can provide complementary insights into the community ecology of aquatic macroinvertebrates. Specifically, this work focuses on macroinvertebrates of Delmarva Bays, isolated seasonal wetlands found on Maryland’s eastern shore. A comparison of macroinvertebrate community change over nine years in a restored wetland complex indicated that the macroinvertebrate community of a rehabilitated wetland more rapidly approximated the community of a reference site than did that of a newly created wetland. The recovery of a natural macroinvertebrate community in the rehabilitated wetland indicated that wetland rehabilitation should be prioritized over wetland creation and that long-term monitoring may be needed to evaluate restoration success. This study also indicated that characteristics of wetland vegetation reflected community composition. The connection between wetland vegetation and macroinvertebrate community composition led to a regional assessment of predaceous diving beetle (Coleoptera: Dytiscidae) community composition in 20 seasonal wetlands, half with and half without sphagnum moss (Sphagnum spp.). Species-level identifications indicated that wetlands with sphagnum support unique and diverse assemblages of beetles. These patterns suggest that sphagnum wetlands provide habitat that supports biodiversity on the Delmarva Peninsula. To compare traits of co-occurring beetles, mandible morphology and its temporal and spatial variation were measured across three species of predaceous diving beetles.
Based on mandible architecture, all species may consume similarly sized prey, but prey characteristics likely differ in terms of piercing force required for successful capture and consumption. Therefore, different assemblages of aquatic beetles may have different effects on macroinvertebrate community structure. Integrating community-level and species-level data strengthens the association between individual organisms and their ecological role. Effective restoration of imperiled wetlands benefits from this integration, as it informs the management practices that both preserve biodiversity and promote ecosystem services.
Abstract:
The surface of the Earth is subjected to vertical deformations caused by geophysical and geological processes, which can be monitored by Global Positioning System (GPS) observations. The purpose of this work is to investigate GPS height time series to identify interannual signals affecting the Earth’s surface over the European and Mediterranean area during the period 2001-2019. Thirty-six homogeneously distributed GPS stations were selected from the online dataset made available by the Nevada Geodetic Laboratory (NGL) on the basis of the length and quality of the data series. Principal Component Analysis (PCA) is applied to extract the main patterns of the spatial and temporal variability of the GPS Up coordinate. The time series were studied by means of a frequency analysis using a periodogram and the real-valued Morlet wavelet. The periodogram is used to identify the dominant frequencies and the spectral density of the investigated signals; the wavelet is applied to identify the signals in the time domain and the relevant periodicities. This study has identified, over the European and Mediterranean area, the presence of interannual non-linear signals with a period of 2 to 4 years, possibly related to atmospheric and hydrological loading displacements and to climate phenomena such as the El Niño Southern Oscillation (ENSO). A clear signal with a period of about six years is present in the vertical component of the GPS time series, likely explainable by the gravitational coupling between the Earth’s mantle and the inner core. Moreover, signals with a period on the order of 8-9 years, possibly explained by mantle-inner core gravity coupling and the cycle of the lunar perigee, and a signal of 18.6 years, likely associated with the lunar nodal cycle, were identified through the wavelet spectrum. However, these last two signals need further confirmation because the present length of the GPS time series is still short compared to the periods involved.
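The periodogram step can be sketched as follows on a synthetic height series containing a six-year oscillation (the Morlet wavelet analysis is not reproduced; the sampling interval and noise level are illustrative assumptions):

```python
import numpy as np

def dominant_period(series, dt_years):
    """Periodogram-based detection of the dominant period in a detrended
    height series: pick the frequency bin with the largest spectral power."""
    series = series - series.mean()
    power = np.abs(np.fft.rfft(series)) ** 2
    freqs = np.fft.rfftfreq(len(series), d=dt_years)
    k = np.argmax(power[1:]) + 1          # skip the zero-frequency bin
    return 1.0 / freqs[k]

# Synthetic weekly 'Up' series (in mm): 6-year oscillation plus noise
rng = np.random.default_rng(3)
t = np.arange(0, 18, 7 / 365.25)          # 18 years of weekly samples
up = 3.0 * np.sin(2 * np.pi * t / 6.0) + rng.normal(0, 1.0, t.size)
print(round(dominant_period(up, 7 / 365.25), 1))
```

The 18.6-year result mentioned above illustrates the caveat in the text: a candidate period close to the record length occupies only the lowest frequency bins, where the periodogram has almost no resolution.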