214 results for SDN, NFV, Cloud Native, Kubernetes, 5GCore
Abstract:
Robotic multiwell planar patch-clamp has become common in drug development and safety programs because it enables efficient and systematic testing of compounds against ion channels during voltage-clamp. It has not, however, been adopted significantly in other important areas of ion channel research, where conventional patch-clamp remains the favored method. Here, we show the wider potential of the multiwell approach, including its capability for efficient intracellular solution exchange, describing protocols and success rates for recording from a range of native and primary mammalian cells derived from blood vessels, arthritic joints and the immune and central nervous systems. The protocol involves preparing a suspension of single cells to be dispensed robotically into 4-8 microfluidic chambers each containing a glass chip with a small aperture. Under automated control, giga-seals and whole-cell access are achieved followed by preprogrammed routines of voltage paradigms and fast extracellular or intracellular solution exchange. Recording from 48 chambers usually takes 1-6 h depending on the experimental design and yields 16-33 cell recordings.
Abstract:
This article argues that a native-speaker baseline is a neglected dimension of studies into second language (L2) performance. If we investigate how learners perform language tasks, we should distinguish what performance features are due to their processing an L2 and which are due to their performing a particular task. Having defined what we mean by “native speaker,” we present the background to a research study into task features on nonnative task performance, designed to include native-speaker data as a baseline for interpreting nonnative-speaker performance. The nonnative results, published in this journal (Tavakoli & Foster, 2008), are recapitulated and then the native-speaker results are presented and discussed in the light of them. The study is guided by the assumption that limited attentional resources impact on L2 performance and explores how narrative design features—namely, complexity of storyline and tightness of narrative structure—affect complexity, fluency, accuracy, and lexical diversity in language. The results show that both native and nonnative speakers are prompted by storyline complexity to use more subordinated language, but narrative structure had different effects on native and nonnative fluency. The learners, who were based in either London or Tehran, did not differ in their performance when compared to each other, except in lexical diversity, where the learners in London were close to native-speaker levels. The implications of the results for the applicability of Levelt’s model of speaking to an L2 are discussed, as is the potential for further L2 research using native speakers as a baseline.
Abstract:
This paper reports on a comparative study of pauses made by L2 learners and native speakers of English while narrating picture stories. The comparison is based on the number of pauses and total amount of silence in the middle and at the end of clauses in the performance of 40 native speakers and 40 L2 learners of English. The results of the quantitative analyses suggest that, although the L2 learners generally pause more often and have longer periods of silence than the native speakers, the distinctive feature of their pausing pattern is that they pause frequently in the middle of clauses rather than at the end. The qualitative analysis of the data suggests that some of the L2 learners’ mid-clause pauses are associated with processes such as replacement, reformulation, and online planning. Formulaic sequences, however, contain very few pauses and therefore appear to facilitate the learners’ fluency.
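As an illustrative sketch only (not the study's analysis code), the mid-clause versus end-of-clause comparison could be tabulated from timestamped pause records along the following lines; the data structure, field values and sample durations are assumptions:

```python
# Hypothetical illustration: aggregate pause counts and total silence
# by clause position ('mid' vs 'end'), as in the comparison described above.
from collections import defaultdict

def summarise_pauses(pauses):
    """pauses: iterable of (position, duration_s) tuples, position being 'mid' or 'end' of clause."""
    counts = defaultdict(int)
    silence = defaultdict(float)
    for position, duration in pauses:
        counts[position] += 1
        silence[position] += duration
    return {pos: {"n_pauses": counts[pos], "total_silence_s": round(silence[pos], 2)}
            for pos in counts}

# Invented example: a speaker who pauses mostly mid-clause.
sample = [("mid", 0.8), ("mid", 1.2), ("end", 0.5), ("mid", 0.6), ("end", 0.4)]
print(summarise_pauses(sample))
```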
Abstract:
Layer clouds are globally extensive. Their lower edges are charged negatively by the fair weather atmospheric electricity current flowing vertically through them. Using polar winter surface meteorological data from Sodankylä (Finland) and Halley (Antarctica), we find that when meteorological diurnal variations are weak, an appreciable diurnal cycle, on average, persists in the cloud base heights, detected using a laser ceilometer. The diurnal cloud base heights from both sites correlate more closely with the Carnegie curve of global atmospheric electricity than with local meteorological measurements. The cloud base sensitivities are indistinguishable between the northern and southern hemispheres, averaging a (4.0 ± 0.5) m rise for a 1% change in the fair weather electric current density. This suggests that the global fair weather current, which is affected by space weather, cosmic rays and the El Niño Southern Oscillation, is linked with layer cloud properties.
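As a minimal worked example of the quoted sensitivity, assuming the (4.0 ± 0.5) m per 1% relationship can be applied linearly; the 10% change used below is purely illustrative:

```python
# Cloud-base response implied by the quoted sensitivity of
# (4.0 ± 0.5) m rise per 1% change in fair-weather current density,
# applied linearly (an assumption made for illustration only).
SENSITIVITY_M_PER_PERCENT = 4.0
SENSITIVITY_UNCERTAINTY = 0.5

def cloud_base_rise(percent_change_in_current):
    """Return (central estimate, uncertainty) of the cloud-base rise in metres."""
    rise = SENSITIVITY_M_PER_PERCENT * percent_change_in_current
    err = SENSITIVITY_UNCERTAINTY * abs(percent_change_in_current)
    return rise, err

rise, err = cloud_base_rise(10.0)  # a hypothetical 10% increase in current density
print(f"Expected cloud-base rise: {rise:.0f} ± {err:.0f} m")
```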
Abstract:
Monthly averaged surface erythemal solar irradiance (UV-Ery) for local noon from 1960 to 2100 has been derived using radiative transfer calculations and projections of ozone, temperature and cloud change from 14 chemistry climate models (CCM), as part of the CCMVal-2 activity of SPARC. Our calculations show the influence of ozone depletion and recovery on erythemal irradiance. In addition, we investigate UV-Ery changes caused by climate change due to increasing greenhouse gas concentrations. The latter include effects of both stratospheric ozone and cloud changes. The derived estimates provide a global picture of the likely changes in erythemal irradiance during the 21st century. Uncertainties arise from the assumed scenarios, different parameterizations – particularly of cloud effects on UV-Ery – and the spread in the CCM projections. The calculations suggest that, relative to 1980, annual mean UV-Ery in the 2090s will be on average 12% lower at high latitudes in both hemispheres, 3% lower at mid latitudes, and marginally higher (1%) in the tropics. The largest reduction (16%) is projected for Antarctica in October. Cloud effects are responsible for 2–3% of the reduction in UV-Ery at high latitudes, but they slightly moderate it at mid-latitudes (1%). The year of return of erythemal irradiance to the values of certain milestones (1965 and 1980) depends largely on the return of column ozone to the corresponding levels and is associated with large uncertainties, mainly due to the spread of the model projections. The inclusion of cloud effects in the calculations has only a small effect on the return years. At mid and high latitudes, changes in clouds and in stratospheric ozone transport driven by greenhouse-gas-induced circulation changes will sustain erythemal irradiance at levels below those of 1965, despite the removal of ozone depleting substances.
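To illustrate what a "return year" means in this context, a small sketch is given below; the time series values are invented placeholders, not CCMVal-2 projections:

```python
# Illustrative 'return year' calculation: the first year after the projected peak
# at which annual-mean erythemal irradiance falls back to a milestone level.
# The decadal values below are invented placeholders in arbitrary units.

def return_year(years, uv_ery, milestone_value):
    """First year after the peak when uv_ery falls back to the milestone level."""
    peak_index = uv_ery.index(max(uv_ery))
    for year, value in zip(years[peak_index:], uv_ery[peak_index:]):
        if value <= milestone_value:
            return year
    return None  # no return within the projected period

years = list(range(1980, 2101, 10))                                    # 13 decadal points
uv_ery = [100, 104, 106, 105, 103, 101, 100, 99, 98, 97, 96, 95, 94]   # invented values
print(return_year(years, uv_ery, milestone_value=100))  # return to the 1980 level -> 2040
```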
Abstract:
An El Niño-like steady response is found in a greenhouse warming simulation resulting from coupled ocean-atmosphere dynamical feedbacks similar to those producing the present-day El Niños. There is a strong negative cloud-radiation feedback on the sea surface temperature (SST) anomaly associated with this enhanced eastern equatorial Pacific warm pattern. However, this negative feedback is overwhelmed by the positive dynamical feedbacks and cannot diminish the sensitivity of the tropical SST to enhanced greenhouse gas concentrations. The enhanced eastern-Pacific warming in the coupled ocean-atmosphere system suggests that coupled dynamics can strengthen this sensitivity.
Abstract:
We analyze here the polar stratospheric temperatures in an ensemble of three 150-year integrations of the Canadian Middle Atmosphere Model (CMAM), an interactive chemistry-climate model which simulates ozone depletion and recovery, as well as climate change. A key motivation is to understand possible mechanisms for the observed trend in the extent of conditions favourable for polar stratospheric cloud (PSC) formation in the Arctic winter lower stratosphere. We find that in the Antarctic winter lower stratosphere, the low temperature extremes required for PSC formation increase in the model as ozone is depleted, but remain steady through the twenty-first century as the warming from ozone recovery roughly balances the cooling from climate change. Thus, ozone depletion itself plays a major role in the Antarctic trends in low temperature extremes. The model trend in low temperature extremes in the Arctic through the latter half of the twentieth century is weaker and less statistically robust than the observed trend. It is not projected to continue into the future. Ozone depletion in the Arctic is weaker in the CMAM than in observations, which may account for the weak past trend in low temperature extremes. In the future, radiative cooling in the Arctic winter due to climate change is more than compensated by an increase in dynamically driven downwelling over the pole.
Abstract:
Purpose: This paper aims to design an evaluation method that enables an organization to assess its current IT landscape and provide a readiness assessment prior to Software as a Service (SaaS) adoption. Design/methodology/approach: The research employs a mix of quantitative and qualitative approaches for conducting an IT application assessment. Quantitative data such as end users’ feedback on the IT applications contribute to the technical impact on efficiency and productivity. Qualitative data such as business domain, business services and IT application cost drivers are used to determine the business value of the IT applications in an organization. Findings: The assessment of IT applications leads to decisions on the suitability of each IT application for migration to a cloud environment. Research limitations/implications: The evaluation of how a particular IT application impacts a business service is based on logical interpretation. A data mining method is suggested in order to derive patterns of IT application capabilities. Practical implications: This method has been applied in a local council in the UK, helping the council to decide the future status of its IT applications for cost-saving purposes.
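The kind of decision the assessment supports could be sketched as below; the scores, threshold and field names are illustrative assumptions, not the paper's actual scoring scheme:

```python
# Illustrative sketch (not the paper's method): combine a technical-impact score
# derived from end-user feedback with a business-value score from qualitative
# drivers, and flag applications that look suitable for cloud migration.
from dataclasses import dataclass

@dataclass
class AppAssessment:
    name: str
    technical_impact: float   # e.g. 0-5, from end-user efficiency/productivity feedback
    business_value: float     # e.g. 0-5, from business domain/services/cost drivers

def migration_recommendation(app, threshold=3.0):
    """Very simple rule: recommend migration when both scores clear the threshold."""
    suitable = app.technical_impact >= threshold and app.business_value >= threshold
    return "migrate to cloud" if suitable else "retain / review"

portfolio = [
    AppAssessment("planning-portal", 4.2, 3.8),   # hypothetical applications
    AppAssessment("legacy-payroll", 2.1, 4.5),
]
for app in portfolio:
    print(app.name, "->", migration_recommendation(app))
```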
Abstract:
This article focuses on the characteristics of persistent thin single-layer mixed-phase clouds. We seek to answer two important questions: (i) how does ice continually nucleate and precipitate from these clouds, without the available ice nuclei becoming depleted? (ii) how do the supercooled liquid droplets persist in spite of the net flux of water vapour to the growing ice crystals? These questions are answered quantitatively using in situ and radar observations of a long-lived mixed-phase cloud layer over the Chilbolton Observatory. Doppler radar measurements show that the top 500 m of cloud (the top 250 m of which is mixed-phase, with ice virga beneath) is turbulent and well-mixed, and the liquid water content is adiabatic. This well-mixed layer is bounded above and below by stable layers. This inhibits entrainment of fresh ice nuclei into the cloud layer, yet our in situ and radar observations show that a steady flux of ≈100 m⁻² s⁻¹ ice crystals fell from the cloud over the course of ∼1 day. Comparing this flux to the concentration of conventional ice nuclei expected to be present within the well-mixed layer, we find that these nuclei would be depleted within less than 1 h. We therefore argue that nucleation in these persistent supercooled clouds is strongly time-dependent in nature, with droplets freezing slowly over many hours, significantly longer than the few seconds residence time of an ice nucleus counter. Once nucleated, the ice crystals are observed to grow primarily by vapour deposition, because of the low liquid water path (21 g m⁻²) yet vapour-rich environment. Evidence for this comes from high differential reflectivity in the radar observations, and in situ imaging of the crystals. The flux of vapour from liquid to ice is quantified from in situ measurements, and we show that this modest flux (3.3 g m⁻² h⁻¹) can be readily offset by slow radiative cooling of the layer to space.
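The nucleus-depletion argument can be made explicit with a back-of-envelope estimate; the ice-nucleus concentration below is an assumed placeholder, while the crystal flux and layer depth are taken from the observations quoted above:

```python
# Back-of-envelope check of the depletion argument: with a steady crystal
# flux F leaving a well-mixed layer of depth h, a reservoir of N ice nuclei
# per unit volume is exhausted after roughly t = N * h / F.
# The nucleus concentration below is a placeholder chosen for illustration.

FLUX = 100.0              # ice crystals m^-2 s^-1 (from the observations)
LAYER_DEPTH = 500.0       # m, depth of the well-mixed layer (from the observations)
IN_CONCENTRATION = 0.5e3  # nuclei m^-3 (0.5 per litre) -- assumed value

depletion_time_s = IN_CONCENTRATION * LAYER_DEPTH / FLUX
print(f"Depletion time: {depletion_time_s / 3600:.1f} h")  # ~0.7 h, i.e. under an hour
```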
Abstract:
The goal of this article is to make an epistemological and theoretical contribution to the nascent field of third language (L3) acquisition and show how examining L3 development can offer a unique view into longstanding debates within L2 acquisition theory. We offer the Phonological Permeability Hypothesis (PPH), which maintains that examining the development of an L3/Ln phonological system and its effects on a previously acquired L2 phonological system can inform contemporary debates regarding the mental constitution of postcritical period adult phonological acquisition. We discuss the predictions and functional significance of the PPH for adult SLA and multilingualism studies, detailing a methodology that examines the effects of acquiring Brazilian Portuguese on the Spanish phonological systems learned before and after the so-called critical period (i.e., comparing simultaneous versus successive adult English-Spanish bilinguals learning Brazilian Portuguese as an L3).
Abstract:
Within generative L2 acquisition research there is a longstanding debate as to what underlies observable differences in L1/L2 knowledge/performance. On the one hand, Full Accessibility approaches maintain that target L2 syntactic representations (new functional categories and features) are acquirable (e.g., Schwartz & Sprouse, 1996). Conversely, Partial Accessibility approaches claim that L2 variability and/or optionality, even at advanced levels, obtains as a result of inevitable deficits in L2 narrow syntax and is conditioned upon a maturational failure in adulthood to acquire (some) new functional features (e.g., Beck, 1998; Hawkins & Chan, 1997; Hawkins & Hattori, 2006; Tsimpli & Dimitrakopoulou, 2007). The present study tests the predictions of these two sets of approaches with advanced English learners of L2 Brazilian Portuguese (n = 21) in the domain of inflected infinitives. These advanced L2 learners reliably differentiate syntactically between finite verbs, uninflected and inflected infinitives, which, as argued, only supports Full Accessibility approaches. Moreover, we will discuss how testing the domain of inflected infinitives is especially interesting in light of recent proposals that Brazilian Portuguese colloquial dialects no longer actively instantiate them (Lightfoot, 1991; Pires, 2002, 2006; Pires & Rothman, 2009; Rothman, 2007).
Abstract:
Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications from terrestrial vegetation monitoring to climate change modeling. This has led to a substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e. data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties – the sill and the mean length scale metric – provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship with a stronger fit for the spatially aggregated NDVI products (R² = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R² = 0.5064). This relationship serves as a reference for evaluating the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions. The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
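For illustration, the two variogram properties used in the comparison (the sill and a length-scale estimate) can be obtained from an empirical semivariogram roughly as follows; the 1-D synthetic NDVI transect and the simple plateau/range estimators are assumptions, not the study's data or exact method:

```python
# Illustrative empirical semivariogram for a 1-D NDVI transect: estimate the sill
# (variance plateau) and a crude length scale (first lag reaching ~95% of the sill).
import numpy as np

def semivariogram(values, max_lag):
    """Empirical semivariance gamma(h) for lags h = 1..max_lag along a 1-D transect."""
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((values[h:] - values[:-h]) ** 2) for h in lags])
    return lags, gamma

rng = np.random.default_rng(0)
# Synthetic, spatially smoothed NDVI transect (placeholder for real imagery).
ndvi = np.convolve(rng.normal(0.5, 0.1, 600), np.ones(10) / 10, mode="valid")

lags, gamma = semivariogram(ndvi, max_lag=60)
sill = gamma[-20:].mean()                              # plateau of the semivariance
length_scale = lags[np.argmax(gamma >= 0.95 * sill)]   # first lag reaching ~95% of the sill
print(f"sill ≈ {sill:.4f}, length scale ≈ {length_scale} pixels")
```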
Abstract:
Native enzymes play a significant role in the proteolysis of milk during storage, particularly heat-resistant native enzymes. Plasmin is one of the most heat-resistant enzymes found in milk and has been reported to survive several heat treatments, causing spoilage during storage. The aim of this study was to assess the susceptibility of high-temperature-heated milk to proteolysis by native enzymes, using the trinitrobenzene sulphonic acid (TNBS) method. Raw milk was heated at 110, 120, 130 and 142 °C for 2 s and at 85 °C for 15 s; the low-temperature treatment (85 °C/15 s) was selected to mimic pasteurisation. The TNBS method confirmed that raw milk and milk processed at 85 °C/15 s were the most proteolysed, whereas treatment of milk at high temperatures (110, 120, 130 and 142 °C for 2 s) inactivated the native enzymes. It may thus be concluded that high-temperature processing has a beneficial effect, reducing proteolysis and thereby the milk's susceptibility to spoilage during storage.
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, this is done over both land and ocean using night-time (infrared) imagery. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 87% and 48% for ocean and land, respectively, using the Bayesian technique, compared with 74% and 39%, respectively, for the threshold-based techniques associated with the validation dataset.
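A minimal sketch of the per-pixel Bayesian step is given below; the Gaussian likelihoods, prior and brightness temperatures are invented stand-ins for the NWP-simulated clear-sky and cloudy distributions used in the paper:

```python
# Per-pixel Bayes' theorem for a clear/cloudy decision (sketch only).
# The likelihood parameters and prior below are assumed placeholders, not the
# paper's NWP-derived distributions.
import math

def gaussian(x, mean, sd):
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def p_clear(observed_bt, prior_clear=0.7):
    """Posterior probability that a pixel is clear, given its brightness temperature (K)."""
    like_clear = gaussian(observed_bt, mean=288.0, sd=2.0)  # assumed clear-sky distribution
    like_cloud = gaussian(observed_bt, mean=275.0, sd=8.0)  # assumed cloudy distribution
    evidence = like_clear * prior_clear + like_cloud * (1.0 - prior_clear)
    return like_clear * prior_clear / evidence

for bt in (289.0, 280.0, 265.0):
    print(bt, "K ->", round(p_clear(bt), 3))
# A threshold on this posterior (e.g. 0.9) would yield an application-specific cloud mask.
```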