876 results for bigdata, data stream processing, dsp, apache storm, cyber security
Abstract:
We report findings from psycholinguistic experiments investigating the detailed timing of processing morphologically complex words by proficient adult second-language (L2) learners of English in comparison to adult native (L1) speakers of English. The first study employed the masked priming technique to investigate -ed forms with a group of advanced Arabic-speaking learners of English. The results replicate previously found L1/L2 differences in morphological priming, even though in the present experiment an additional temporal delay was inserted after the presentation of the prime words. The second study examined the timing of constraints against inflected forms inside derived words in English using the eye-movement monitoring technique and an additional acceptability judgment task with highly advanced Dutch L2 learners of English in comparison to adult L1 English controls. Whilst the L2 learners performed native-like offline, the eye-movement data showed that their online processing was not affected by the morphological constraint against regular plurals inside derived words in the same way as in native speakers. Taken together, these findings indicate that L2 learners are not just slower than native speakers in processing morphologically complex words, but that the L2 comprehension system employs real-time grammatical analysis (in this case, morphological information) less than the L1 system.
Abstract:
We present projections of winter storm-induced insured losses in the German residential building sector for the 21st century. With this aim, two structurally independent downscaling methods and one hybrid downscaling method are applied to a 3-member ensemble of ECHAM5/MPI-OM1 A1B scenario simulations. One method uses dynamical downscaling of intense winter storm events in the global model, and a transfer function to relate regional wind speeds to losses. The second method is based on a reshuffling of present day weather situations and sequences, taking into account the change of their frequencies according to the linear temperature trends of the global runs. The third method uses statistical-dynamical downscaling, considering frequency changes of the occurrence of storm-prone weather patterns, and translation into loss by using empirical statistical distributions. The A1B scenario ensemble was downscaled by all three methods until 2070, and by the (statistical-) dynamical methods until 2100. Furthermore, all methods assume a constant statistical relationship between meteorology and insured losses and no developments other than climate change, such as in constructions or claims management. The study utilizes data provided by the German Insurance Association encompassing 24 years at district-scale resolution. Compared to 1971–2000, the downscaling methods indicate an increase of 10-year return values (i.e. loss ratios per return period) of 6–35 % for 2011–2040, of 20–30 % for 2041–2070, and of 40–55 % for 2071–2100. Convolving various sources of uncertainty in one confidence statement (data-, loss model-, storm realization-, and Pareto fit-uncertainty), the return-level confidence interval for a return period of 15 years expands by more than a factor of two. Finally, we suggest how practitioners can deal with alternative scenarios or possible natural excursions of observed losses.
Abstract:
In late February 2010 the extraordinary windstorm Xynthia crossed over Southwestern and Central Europe and caused severe damage, affecting particularly the Spanish and French Atlantic coasts. The storm was embedded in uncommon large-scale atmospheric and boundary conditions prior to and during its development, namely enhanced sea surface temperatures (SST) within the low-level entrainment zone of air masses, an unusual southerly position of the polar jet stream, and a remarkable split jet structure in the upper troposphere. To analyse the processes that led to the rapid intensification of this exceptional storm originating close to the subtropics (30°N), the sensitivity of the cyclone intensification to latent heat release is determined using the regional climate model COSMO-CLM forced with ERA-Interim data. A control simulation with observed SST shows that moist and warm air masses originating from the subtropical North Atlantic were involved in the cyclogenesis process and led to the formation of a vertical tower with high values of potential vorticity (PV). Sensitivity studies with reduced SST or increased laminar boundary roughness for heat led to reduced surface latent heat fluxes. This induced both a weaker and partly retarded development of the cyclone and a weakening of the PV-tower together with reduced diabatic heating rates, particularly at lower and mid levels. We infer that diabatic processes played a crucial role during the phase of rapid deepening of Xynthia and thus contributed to its intensity over the Southeastern North Atlantic. We suggest that windstorms like Xynthia may occur more frequently under future climate conditions due to the warming SSTs and potentially enhanced latent heat release, thus increasing the windstorm risk for Southwestern Europe.
Abstract:
JASMIN is a super-data-cluster designed to provide a high-performance high-volume data analysis environment for the UK environmental science community. Thus far JASMIN has been used primarily by the atmospheric science and earth observation communities, both to support their direct scientific workflow, and the curation of data products in the STFC Centre for Environmental Data Archival (CEDA). Initial JASMIN configuration and first experiences are reported here. Useful improvements in scientific workflow are presented. It is clear from the explosive growth in stored data and use that there was a pent-up demand for a suitable big-data analysis environment. This demand is not yet satisfied, in part because JASMIN does not yet have enough compute capacity, its storage is fully allocated, and not all software needs are met. Plans to address these constraints are introduced.
Abstract:
The use of ageostrophic flow to infer the presence of vertical circulations in the entrances and exits of the climatological jet streams is questioned. Problems of interpretation arise because of the use of different definitions of geostrophy in theoretical studies and in analyses of atmospheric data. The nature and role of the ageostrophic flow based on constant and variable Coriolis parameter definitions of geostrophy vary. In the latter the geostrophic divergence cannot be neglected, so the vertical motion is not associated solely with the ageostrophic flow. Evidence is presented suggesting that ageostrophic flow in the climatological jet streams is primarily determined by the kinematic requirements of wave retrogression rather than by a forcing process. These requirements are largely met by the rotational flow, with the divergent circulations present being geostrophically forced, and so playing a secondary, restoring role.
Abstract:
Exascale systems are the next frontier in high-performance computing and are expected to deliver a performance of the order of 10^18 operations per second using massive multicore processors. Very large- and extreme-scale parallel systems pose critical algorithmic challenges, especially related to concurrency, locality and the need to avoid global communication patterns. This work investigates a novel protocol for dynamic group communication that can be used to remove the global communication requirement and to reduce the communication cost in parallel formulations of iterative data mining algorithms. The protocol is used to provide a communication-efficient parallel formulation of the k-means algorithm for cluster analysis. The approach is based on a collective communication operation for dynamic groups of processes and exploits non-uniform data distributions. Non-uniform data distributions can be either found in real-world distributed applications or induced by means of multidimensional binary search trees. The analysis of the proposed dynamic group communication protocol has shown that it does not introduce significant communication overhead. The parallel clustering algorithm has also been extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
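The abstract above concerns a communication-efficient parallel formulation of k-means. As a point of reference, the serial algorithm being parallelized can be sketched in a few lines; this is a minimal illustration of standard k-means only, not the paper's dynamic group communication protocol:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain serial k-means (Lloyd's algorithm). The paper's contribution is
    a communication-efficient parallel formulation, which this sketch does
    not reproduce."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid
        # (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Update step: move each centroid to the mean of its cluster;
        # empty clusters keep their previous centroid.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = tuple(sum(xs) / len(cl) for xs in zip(*cl))
    return centroids

# Two well-separated clusters converge to their means:
pts = [(0.0, 0.0), (0.0, 1.0), (10.0, 10.0), (10.0, 11.0)]
print(sorted(kmeans(pts, 2)))  # [(0.0, 0.5), (10.0, 10.5)]
```

In the parallel setting the update step is where communication arises: every process must combine partial cluster sums, which is why the paper replaces the global reduction with dynamic group communication.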
Abstract:
Streamwater nitrate dynamics in the River Hafren, Plynlimon, mid-Wales were investigated over decadal to sub-daily timescales using a range of statistical techniques. Long-term data were derived from weekly grab samples (1984–2010) and high-frequency data from 7-hourly samples (2007–2009), both measured at two sites: a headwater stream draining moorland and a downstream site below plantation forest. This study is one of the first to analyse upland streamwater nitrate dynamics across such a wide range of timescales and report on the principal mechanisms identified. The data analysis provided no clear evidence that the long-term decline in streamwater nitrate concentrations was related to a decline in atmospheric deposition alone, because nitrogen deposition first increased and then decreased during the study period. Increased streamwater temperature and denitrification may also have contributed to the decline in stream nitrate concentrations, the former through increased N uptake rates and the latter resulting from increased dissolved organic carbon concentrations. Strong seasonal cycles, with concentration minima in the summer, were driven by seasonal flow minima and seasonal biological activity enhancing nitrate uptake. Complex diurnal dynamics were observed, with seasonal changes in phase and amplitude of the cycling, and the diurnal dynamics were variable along the river. At the moorland site, a regular daily cycle, with minimum concentrations in the early afternoon corresponding with peak air temperatures, indicated the importance of instream biological processing. At the downstream site, the diurnal dynamics were a composite signal, resulting from advection, dispersion and nitrate processing in the soils of the lower catchment. The diurnal streamwater nitrate dynamics were also affected by drought conditions. Enhanced diurnal cycling in Spring 2007 was attributed to increased nitrate availability in the post-drought period as well as low flow rates and high temperatures over this period. The combination of high-frequency short-term measurements and long-term monitoring provides a powerful tool for increasing understanding of the controls of element fluxes and concentrations in surface waters.
Abstract:
Predicting the future response of the Antarctic Ice Sheet to climate change requires an understanding of the ice streams that dominate its dynamics. Here we use cosmogenic isotope exposure-age dating (26Al, 10Be and 36Cl) of erratic boulders on ice-free land on James Ross Island, north-eastern Antarctic Peninsula, to define the evolution of Last Glacial Maximum (LGM) ice in the adjacent Prince Gustav Channel. These data include ice-sheet extent, thickness and dynamical behaviour. Prior to ∼18 ka, the LGM Antarctic Peninsula Ice Sheet extended to the continental shelf-edge and transported erratic boulders onto high-elevation mesas on James Ross Island. After ∼18 ka there was a period of rapid ice-sheet surface-lowering, coincident with the initiation of the Prince Gustav Ice Stream. This timing coincided with rapid increases in atmospheric temperature and eustatic sea-level rise around the Antarctic Peninsula. Collectively, these data provide evidence for a transition from a thick, cold-based LGM Antarctic Peninsula Ice Sheet to a thinner, partially warm-based ice sheet during deglaciation.
Abstract:
The ability to match individual patients to tailored treatments has the potential to greatly improve outcomes for individuals suffering from major depression. In particular, while the vast majority of antidepressant treatments affect either serotonin or noradrenaline or a combination of these two neurotransmitters, it is not known whether there are particular patients or symptom profiles which respond preferentially to the potentiation of serotonin over noradrenaline or vice versa. Experimental medicine models suggest that the primary mode of action of these treatments may be to remediate negative biases in emotional processing. Such models may provide a useful framework for interrogating the specific actions of antidepressants. Here, we therefore review evidence from studies examining the effects of drugs which potentiate serotonin, noradrenaline or a combination of both neurotransmitters on emotional processing. These results suggest that antidepressants targeting serotonin and noradrenaline may have some specific actions on emotion and reward processing which could be used to improve tailoring of treatment or to understand the effects of dual-reuptake inhibition. Specifically, serotonin may be particularly important in alleviating distress symptoms, while noradrenaline may be especially relevant to anhedonia. The data reviewed here also suggest that noradrenergic-based treatments may have earlier effects on emotional memory than those which affect serotonin.
Abstract:
The DIAMET (DIAbatic influences on Mesoscale structures in ExTratropical storms) project aims to improve forecasts of high-impact weather in extratropical cyclones through field measurements, high-resolution numerical modeling, and improved design of ensemble forecasting and data assimilation systems. This article introduces DIAMET and presents some of the first results. Four field campaigns were conducted by the project, one of which, in late 2011, coincided with an exceptionally stormy period marked by an unusually strong, zonal North Atlantic jet stream and a succession of severe windstorms in northwest Europe. As a result, December 2011 had the highest monthly North Atlantic Oscillation index (2.52) of any December in the last 60 years. Detailed observations of several of these storms were gathered using the UK’s BAe146 research aircraft and extensive ground-based measurements. As an example of the results obtained during the campaign, observations are presented of cyclone Friedhelm on 8 December 2011, when surface winds with gusts exceeding 30 m s-1 crossed central Scotland, leading to widespread disruption to transportation and electricity supply. Friedhelm deepened 44 hPa in 24 hours and developed a pronounced bent-back front wrapping around the storm center. The strongest winds at 850 hPa and the surface occurred in the southern quadrant of the storm, and detailed measurements showed these to be most intense in clear air between bands of showers. High-resolution ensemble forecasts from the Met Office showed similar features, with the strongest winds aligned in linear swaths between the bands, suggesting that there is potential for improved skill in forecasts of damaging winds.
Abstract:
Background: Auditory discrimination is significantly impaired in Wernicke’s aphasia (WA) and thought to be causatively related to the language comprehension impairment which characterises the condition. This study used mismatch negativity (MMN) to investigate the neural responses corresponding to successful and impaired auditory discrimination in WA. Methods: Behavioural auditory discrimination thresholds of CVC syllables and pure tones were measured in WA (n=7) and control (n=7) participants. Threshold results were used to develop multiple-deviant mismatch negativity (MMN) oddball paradigms containing deviants which were either perceptibly or non-perceptibly different from the standard stimuli. MMN analysis investigated differences associated with group, condition and perceptibility as well as the relationship between MMN responses and comprehension (within which behavioural auditory discrimination profiles were examined). Results: MMN waveforms were elicited by both perceptible and non-perceptible auditory changes. Perceptibility was only distinguished by MMN amplitude in the pure tone (PT) condition. The WA group could be distinguished from controls by an increase in MMN response latency to CVC stimuli change. Correlation analyses displayed a relationship between behavioural CVC discrimination and MMN amplitude in the control group, where greater amplitude corresponded to better discrimination. The WA group displayed the inverse effect; both discrimination accuracy and auditory comprehension scores were reduced with increased MMN amplitude. In the WA group, a further correlation was observed between the lateralisation of MMN response and CVC discrimination accuracy; the greater the bilateral involvement, the better the discrimination accuracy.
Conclusions: The results from this study provide further evidence for the nature of auditory comprehension impairment in WA and indicate that the auditory discrimination deficit is grounded in a reduced ability to engage in efficient hierarchical processing and the construction of invariant auditory objects. Correlation results suggest that people with chronic WA may rely on an inefficient, noisy right hemisphere auditory stream when attempting to process speech stimuli.
Abstract:
During the last decades, several windstorm series hit Europe leading to large aggregated losses. Such storm series are examples of serial clustering of extreme cyclones, presenting a considerable risk for the insurance industry. Clustering of events and return periods of storm series for Germany are quantified based on potential losses using empirical models. Two reanalysis data sets and observations from German weather stations are considered for 30 winters. Histograms of events exceeding selected return levels (1-, 2- and 5-year) are derived. Return periods of historical storm series are estimated based on the Poisson and the negative binomial distributions. Over 4000 years of general circulation model (GCM) simulations forced with current climate conditions are analysed to provide a better assessment of historical return periods. Estimates differ between distributions, for example 40 to 65 years for the 1990 series. For such less frequent series, estimates obtained with the Poisson distribution clearly deviate from empirical data. The negative binomial distribution provides better estimates, even though a sensitivity to return level and data set is identified. The consideration of GCM data permits a substantial reduction of uncertainties. The present results support the importance of considering explicitly clustering of losses for an adequate risk assessment for economic applications.
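The contrast the abstract draws between Poisson and negative binomial count models can be illustrated numerically. The rate, dispersion and event threshold below are hypothetical, chosen only to show why an overdispersed (clustered) model yields shorter return periods for multi-storm winters than an equal-mean Poisson model:

```python
from math import exp, factorial, comb

def poisson_pmf(n, lam):
    """Probability of n events in one winter, mean rate lam."""
    return lam ** n * exp(-lam) / factorial(n)

def negbin_pmf(n, r, p):
    """Negative binomial pmf with integer dispersion r and probability p;
    mean = r*(1-p)/p, variance exceeds the mean (overdispersion)."""
    return comb(n + r - 1, n) * (1 - p) ** n * p ** r

def tail_prob(pmf, n_min, n_max=100):
    """P(count >= n_min), summed up to a truncation point n_max."""
    return sum(pmf(n) for n in range(n_min, n_max))

def return_period(tail):
    """Return period in winters of a season with at least n_min events."""
    return 1.0 / tail

# Hypothetical comparison: both models have mean 1 event per winter,
# but the negative binomial (r=1, p=0.5) has variance 2.
p_pois = tail_prob(lambda n: poisson_pmf(n, 1.0), 3)
p_nb = tail_prob(lambda n: negbin_pmf(n, 1, 0.5), 3)
print(return_period(p_pois))  # ~12.5 winters for >=3 events
print(return_period(p_nb))    # 8 winters: clustering makes series likelier
```

The shorter negative binomial return period for the same mean rate is the mechanism behind the abstract's finding that Poisson-based estimates deviate from empirical data for infrequent storm series.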
Abstract:
The WFDEI meteorological forcing data set has been generated using the same methodology as the widely used WATCH Forcing Data (WFD) by making use of the ERA-Interim reanalysis data. We discuss the specifics of how changes in the reanalysis and processing have led to improvement over the WFD. We attribute improvements in precipitation and wind speed to the latest reanalysis basis data, and improved downward shortwave fluxes to the changes in the aerosol corrections. Covering 1979–2012, the WFDEI will allow more thorough comparisons of hydrological and Earth System model outputs with hydrologically and phenologically relevant satellite products than was possible with the WFD.
Abstract:
Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. We investigate the reasons for this for one model, INCA-P, testing model output against 18 months of daily data in a small Scottish catchment. We examine key model processes and provide recommendations for model improvement and simplification. Improvements to the particulate phosphorus simulation are especially needed. The model evaluation procedure is then generalised to provide a checklist for identifying why model performance may be poor or unreliable, incorporating calibration, data, structural and conceptual challenges. There needs to be greater recognition that current models struggle to produce positive Nash–Sutcliffe statistics in agricultural catchments when evaluated against daily data. Phosphorus modelling is difficult, but models are not as useless as this might suggest. We found a combination of correlation coefficients, bias, a comparison of distributions and a visual assessment of time series to be a better means of identifying realistic simulations.
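The Nash–Sutcliffe statistic the abstract mentions is a standard skill score for hydrological simulations; a minimal sketch of its computation (function name and example values are illustrative, not from the study):

```python
def nash_sutcliffe(observed, simulated):
    """Nash–Sutcliffe efficiency: 1 is a perfect simulation, 0 means the
    simulation is no better than the mean of the observations, and negative
    values (as the abstract discusses for daily phosphorus data) mean the
    model performs worse than simply predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)  # assumes non-constant obs
    return 1.0 - ss_res / ss_tot

# Illustrative values: a flat simulation of a varying series scores below zero.
print(nash_sutcliffe([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0]))  # 1.0
print(nash_sutcliffe([1.0, 2.0, 3.0, 4.0], [2.0, 2.0, 2.0, 2.0]))  # -0.2
```

Because the score is normalised by the variance of the observations, flashy daily phosphorus series with a few large storm events are exactly the setting in which positive values are hard to achieve, which is the point the abstract makes.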