935 results for International Institute for Applied Systems Analysis
Abstract:
The authors compare the performance of two types of controllers: one based on a multilayered network and the other based on a single-layered CMAC (cerebellar model articulation controller) network. The neurons (information processing units) in the multilayered network use Gaussian activation functions. The control scheme considered is a predictive control algorithm, along the lines of Willis et al. (1991) and Kambhampati and Warwick (1991). The process selected as a test bed is a continuous stirred tank reactor in which an irreversible exothermic reaction takes place in a constant-volume vessel cooled by a single coolant stream. This reactor is a simplified version of the first tank in the two-tank system given by Henson and Seborg (1989).
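A minimal sketch (Python) may help fix ideas: a Gaussian radial basis function network used as a one-step-ahead process model, the kind of model a neural-network-based predictive controller queries when searching for a control move. The class name, shared width and least-squares training step are illustrative assumptions, not the authors' implementation.

import numpy as np

class GaussianRBFModel:
    """One-step-ahead process model built from Gaussian units (illustrative sketch)."""

    def __init__(self, centres, width):
        self.centres = np.asarray(centres, dtype=float)  # (n_units, n_inputs)
        self.width = float(width)                        # shared Gaussian width (assumption)
        self.weights = np.zeros(len(self.centres))

    def _phi(self, x):
        # Gaussian activations of all units for one regressor vector x.
        d2 = np.sum((self.centres - x) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * self.width ** 2))

    def fit(self, X, y):
        # The output layer is linear in the weights, so a least-squares fit suffices.
        Phi = np.array([self._phi(x) for x in X])
        self.weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)

    def predict(self, x):
        # Predict y(k+1) from a regressor such as [y(k), y(k-1), u(k)].
        return self._phi(np.asarray(x, dtype=float)) @ self.weights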
Abstract:
The authors consider the problem of a robot manipulator operating in a noisy workspace. The manipulator is required to move from an initial position P(i) to a final position P(f). P(i) is assumed to be completely defined, whereas P(f) is obtained by a sensing operation and is assumed to be fixed but unknown. The authors' approach to this problem involves three learning algorithms: the discretized linear reward-penalty (DLR-P) automaton, the linear reward-penalty (LR-P) automaton and a nonlinear reinforcement scheme. An automaton is placed at each joint of the robot and, acting as a decision maker, plans the trajectory based on noisy measurements of P(f).
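For reference, the linear reward-penalty (LR-P) update named in the abstract can be sketched in a few lines of Python; the action set size, step sizes and reward test below are illustrative assumptions rather than the authors' implementation.

import random

class LinearRewardPenaltyAutomaton:
    def __init__(self, n_actions, a=0.1, b=0.05):
        self.p = [1.0 / n_actions] * n_actions  # action probabilities
        self.a, self.b = a, b                   # reward / penalty step sizes (assumed)

    def choose(self):
        # Sample an action (e.g., a joint displacement) from the current probabilities.
        return random.choices(range(len(self.p)), weights=self.p)[0]

    def update(self, action, rewarded):
        r = len(self.p)
        for j in range(r):
            if rewarded:
                # Reward: shift probability mass towards the chosen action.
                if j == action:
                    self.p[j] = self.p[j] + self.a * (1.0 - self.p[j])
                else:
                    self.p[j] = (1.0 - self.a) * self.p[j]
            else:
                # Penalty: redistribute mass away from the chosen action.
                if j == action:
                    self.p[j] = (1.0 - self.b) * self.p[j]
                else:
                    self.p[j] = self.b / (r - 1) + (1.0 - self.b) * self.p[j]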
Abstract:
Aim: A nested case-control discovery study was undertaken to test whether information within the serum peptidome can improve on the utility of CA125 for early ovarian cancer detection. Materials and Methods: High-throughput matrix-assisted laser desorption ionisation mass spectrometry (MALDI-MS) was used to profile 295 serum samples from women pre-dating their ovarian cancer diagnosis and 585 matched control samples. Classification rules incorporating CA125 and MS peak intensities were tested for discriminating ability. Results: Two peaks were found which, in combination with CA125, discriminated cases from controls up to 15 and 11 months before diagnosis, respectively, and earlier than CA125 alone. One peak was identified as connective tissue-activating peptide III (CTAPIII), whilst the other was putatively identified as platelet factor 4 (PF4). ELISA data supported the down-regulation of PF4 in early cancer cases. Conclusion: Serum peptide information combined with CA125 improves the lead time for early detection of ovarian cancer. The candidate markers are platelet-derived chemokines, suggesting a link between platelet function and tumour development.
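A hypothetical sketch (Python) of the kind of combined rule described, using CA125 together with the intensity of one discriminating peak; the cut-off values and the direction of the peak change are illustrative assumptions, not the fitted rule from the study.

def combined_rule(ca125_u_ml, peak_intensity, ca125_cutoff=30.0, peak_cutoff=0.5):
    # Flag as a potential case if CA125 is raised, or if the candidate peptide
    # peak (reported as down-regulated in early cases) falls below a cut-off.
    # Both cut-offs here are placeholders for illustration only.
    return ca125_u_ml >= ca125_cutoff or peak_intensity <= peak_cutoff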
Abstract:
Government targets for CO2 reductions are being progressively tightened; the Climate Change Act sets the UK target of an 80% reduction by 2050 relative to 1990 levels. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich and are suited to modelling technological change. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals' behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents are used to model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and for analysing the cost-effectiveness of various policy measures.
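A minimal Python sketch of the kind of ABM the paper argues for: householder agents adopt a low-carbon measure according to a simple payback rule plus peer influence. The thresholds, payback figure and decision rule are illustrative assumptions, not taken from the paper.

import random

class Householder:
    def __init__(self, payback_threshold_years):
        self.threshold = payback_threshold_years  # limited economic rationality
        self.adopted = False

    def decide(self, payback_years, adoption_rate):
        # Adopt if the payback is acceptable, or if enough other households
        # have already adopted (peer effect relaxes the economic criterion).
        if payback_years <= self.threshold or adoption_rate > 0.5:
            self.adopted = True

def run(n_agents=1000, years=20, payback_years=8.0):
    agents = [Householder(random.uniform(3, 12)) for _ in range(n_agents)]
    for _ in range(years):
        rate = sum(a.adopted for a in agents) / n_agents  # population-wide rate (simplification)
        for a in agents:
            if not a.adopted:
                a.decide(payback_years, rate)
    return sum(a.adopted for a in agents) / n_agents

if __name__ == "__main__":
    print(f"Adoption after 20 years: {run():.1%}")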
Abstract:
This paper proposes a nonlinear regression structure comprising a wavelet network and a linear term. The introduction of the linear term is aimed at providing a more parsimonious interpolation in high-dimensional spaces when the modelling samples are sparse. A constructive procedure for building such structures, termed linear-wavelet networks, is described. For illustration, the proposed procedure is employed in the framework of dynamic system identification. In an example involving a simulated fermentation process, it is shown that a linear-wavelet network yields a smaller approximation error when compared with a wavelet network with the same number of regressors. The proposed technique is also applied to the identification of a pressure plant from experimental data. In this case, the results show that the introduction of wavelets considerably improves the prediction ability of a linear model. Standard errors on the estimated model coefficients are also calculated to assess the numerical conditioning of the identification process.
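For illustration, a minimal Python sketch of the linear-wavelet regression structure: a linear term plus a sum of wavelet regressors, fitted by least squares. The choice of a Mexican-hat mother wavelet and of fixed translations/dilations are assumptions made here for brevity, not necessarily the authors' constructive procedure.

import numpy as np

def mexican_hat(z):
    # Mexican-hat mother wavelet evaluated on the squared norm of the scaled input.
    r2 = np.sum(z ** 2, axis=-1)
    return (1.0 - r2) * np.exp(-r2 / 2.0)

def design_matrix(X, translations, dilations):
    # Columns: [1, x_1 .. x_d, psi_1(x) .. psi_m(x)]  (linear term + wavelet regressors)
    wavelet_cols = [mexican_hat((X - t) / s) for t, s in zip(translations, dilations)]
    return np.column_stack([np.ones(len(X)), X] + wavelet_cols)

def fit(X, y, translations, dilations):
    # Least-squares estimate of linear coefficients and wavelet weights.
    Phi = design_matrix(np.asarray(X, dtype=float), translations, dilations)
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return theta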
Abstract:
In recent years, the life-event approach has been widely used by governments all over the world for designing and providing web services to citizens through their e-government portals. Despite the wide usage of this approach, the challenge remains of how to use it to design e-government portals that automatically provide personalised services to citizens. We propose a conceptual framework for e-government service provision based on the life-event approach and the use of a citizen profile to capture citizens' needs, since the process of finding Web services from a government-to-citizen (G2C) system involves understanding citizens' needs and demands, selecting the relevant services, and delivering services that match the requirements. The proposed framework, which incorporates the citizen profile, is based on three components that complement each other: anticipatory life events, non-anticipatory life events and recurring services.
Abstract:
The "Trieste question", that is, the question of the Italian-Yugoslav border in the aftermath of the Second World War, has long been the object of careful examination in both Italian and foreign historiography. With a few important exceptions, the overall reconstruction of these events has been based for the most part on historical-diplomatic approaches, which have sometimes made it difficult to understand clearly the relationships and interdependences at play between the local, national and international contexts. Through a comparative analysis of a large body of documents from the National Archives and Records Administration (NARA), College Park MD, this essay attempts a rereading of the various phases in which the question developed between June 1945 and October 1954, following a twofold perspective. The first part focuses on American policy in Trieste, looking specifically at two internal and closely linked aspects: on the one hand, the management of law and order and a "strategy" of consent, to be achieved through the control of the means of information; on the other, the promotion of a cultural policy. Both aspects can be traced back to the "direct rule" model, which gave the Allied Military Government (AMG) full and exclusive governing authority over Zone A of Venezia Giulia, and both are central to understanding the interaction between institutions and social actors. In the second part of the essay, the change in archival sources indicates a new set of priorities in American foreign policy regarding Trieste: outside the international negotiations for the settlement of the question, the Clare Boothe Luce papers held in the Embassy's records show above all how the Trieste question was projected outwards, towards Italy in particular, and exploited – principally by the ambassador – within the bipolar optic of the Cold War in order to strengthen domestic support for Atlantic policies. The essay therefore follows two main lines of inquiry, within and outside Trieste (within in 1945-1952, outside in 1953-1954), since these emerge from the sources consulted as the priority areas in the two periods.
Abstract:
One central question in the formal linguistic study of adult multilingual morphosyntax (i.e., L3/Ln acquisition) involves determining the role(s) the L1 and/or the L2 play(s) at the L3 initial state (e.g., Bardel & Falk, Second Language Research 23: 459–484, 2007; Falk & Bardel, Second Language Research: forthcoming; Flynn et al., The International Journal of Multilingualism 8: 3–16, 2004; Rothman, Second Language Research: forthcoming; Rothman & Cabrelli, On the initial state of L3 (Ln) acquisition: Selective or absolute transfer?: 2007; Rothman & Cabrelli Amaro, Second Language Research 26: 219–289, 2010). The present article adds to this general program, testing Rothman's (Second Language Research: forthcoming) model of L3 initial-state transfer, which maintains that, when relevant in light of specific language pairings, typological proximity between the languages is the decisive variable determining the selection of syntactic transfer. Herein, I present empirical evidence from the later part of the beginning stages of L3 Brazilian Portuguese (BP) by native speakers of English and Spanish who have attained an advanced level of proficiency in either English or Spanish as an L2. Examining the related domains of syntactic word order and relative clause attachment preference in L3 BP, the data clearly indicate that Spanish is transferred for both experimental groups irrespective of whether it was the L1 or the L2. These results are expected by Rothman's (Second Language Research: forthcoming) model, but not necessarily predicted by other current hypotheses of multilingual syntactic transfer; the implications of this are discussed.
Abstract:
It has been argued that extended exposure to naturalistic input provides L2 learners with more of an opportunity to converge on target morphosyntactic competence than classroom-only environments, given that the former provides more positive evidence of less salient linguistic properties than the latter (e.g., Isabelli 2004). Implicitly, the claim is that such exposure is needed to fully reset parameters. However, such a position conflicts with the notion of parameterization (cf. Rothman and Iverson 2007). In light of two types of competing generative theories of adult L2 acquisition – the No Impairment Hypothesis (e.g., Duffield and White 1999) and so-called Failed Features approaches (e.g., Beck 1998; Franceschina 2001; Hawkins and Chan 1997) – we investigate the verifiability of such a claim. Thirty intermediate L2 Spanish learners were tested on properties of the Null-Subject Parameter before and after study abroad. The data suggest that (i) parameter resetting is possible and (ii) exposure to naturalistic input is not privileged.
Abstract:
It is now established that certain cognitive processes such as categorisation are tightly linked to the concepts encoded in language. Recent studies have shown that bilinguals with languages that differ in their concepts may show a shift in their cognition towards the L2 pattern primarily as a function of their L2 proficiency. This research has so far focused predominantly on L2 users who started learning the L2 in childhood or early puberty. The current study asks whether similar effects can be found in adult L2 learners. English speakers of L2 Japanese were given an object classification task involving real physical objects, and an online classification task involving artificial novel objects. Results showed a shift towards the L2 pattern, indicating that some degree of cognitive plasticity exists even when a second language is acquired later in life. These results have implications for theories of L2 acquisition and bilingualism, and contribute towards our understanding of the nature of the relationship between language and cognition in the L2 user’s mind.
Abstract:
Flooding is a particular hazard in urban areas worldwide due to the increased risks to life and property in these regions. Synthetic Aperture Radar (SAR) sensors are often used to image flooding because of their all-weather day-night capability, and now possess sufficient resolution to image urban flooding. The flood extents extracted from the images may be used for flood relief management and improved urban flood inundation modelling. A difficulty with using SAR for urban flood detection is that, because of its side-looking nature, substantial areas of urban ground surface may not be visible to the SAR, owing to radar layover and shadow caused by buildings and taller vegetation. This paper investigates whether urban flooding can be detected in layover regions (where flooding may not normally be apparent) using double scattering between the (possibly flooded) ground surface and the walls of adjacent buildings. The method estimates double scattering strengths using a SAR image in conjunction with a high resolution LiDAR (Light Detection and Ranging) height map of the urban area. A SAR simulator is applied to the LiDAR data to generate maps of layover and shadow and to estimate the positions of double scattering curves in the SAR image. Observations of double scattering strengths were compared to the predictions of an electromagnetic scattering model, for both the case of a single image containing flooding and a change detection case in which the flooded image was compared to an un-flooded image of the same area acquired with the same radar parameters. The method proved successful in detecting double scattering due to flooding in the single-image case, for which flooded double scattering curves were detected with 100% classification accuracy (albeit using a small sample set) and un-flooded curves with 91% classification accuracy. The same measures of success were achieved using change detection between flooded and un-flooded images. Depending on the particular flooding situation, the method could lead to improved detection of flooding in urban areas.
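A hypothetical Python sketch of the change-detection step described above: a double-scattering curve (predicted by the SAR simulator from the LiDAR heights) is flagged as flooded when its mean backscatter rises sufficiently relative to the un-flooded reference image. The threshold value and the use of a simple mean are illustrative assumptions, not the paper's classifier.

import numpy as np

def classify_double_scattering(flood_db, reference_db, change_threshold_db=3.0):
    # flood_db / reference_db: backscatter samples (dB) taken along the same
    # predicted double-scattering curve in the flood and reference images.
    change = np.mean(flood_db) - np.mean(reference_db)
    return change >= change_threshold_db  # True -> curve classified as flooded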
Abstract:
Background: Efficacy of endocrine therapy is compromised when human breast cancer cells circumvent imposed growth inhibition. The model of long-term oestrogen-deprived MCF-7 human breast cancer cells has suggested the mechanism results from hypersensitivity to low levels of residual oestrogen. Materials and methods: MCF-7 cells were maintained for up to 30 weeks in phenol-red-free medium and charcoal-stripped serum with 10^-8 M 17β-oestradiol and 10 µg/ml insulin (stock 1), 10^-8 M 17β-oestradiol (stock 2), 10 µg/ml insulin (stock 3) or no addition (stock 4). Results: Loss of growth response to oestrogen was observed only in stock 4 cells. Long-term maintenance with insulin in the absence of oestradiol (stock 3) resulted in raised oestrogen receptor alpha (ERα) levels (measured by western immunoblotting) and development of hypersensitivity (assayed by oestrogen-responsive reporter gene induction and dose response to oestradiol for proliferation under serum-free conditions), but with no loss of growth response to oestrogen. Conclusion: Hypersensitivity can develop without any growth adaptation and therefore is not a prerequisite for loss of growth response in MCF-7 cells.
Abstract:
Healthcare organizations are known for their complex and intense information environment. Healthcare information is provided through heterogeneous information systems or paper-based sources, and accessing the right information under increasing time pressure is extremely challenging. This paper proposes an information architecture for healthcare organizations which facilitates the provision of the right information to the right person in the right place at the right time, tailored to their requirements. The study adopts an abductive reasoning research approach, with organizational semiotics as its theoretical underpinning, guiding data collection through direct observation in the ophthalmology outpatient clinics of a UK hospital. The outcome is a set of norms and information objects that form the information architecture, which is modelled in ArchiMate. The contribution of the information architecture can be seen from organizational, social and technical perspectives: it shows clearly how information is provided within a healthcare organization, reduces duplicated data entry, and guides future technological implementation.
Abstract:
Three methodological limitations of English-Chinese contrastive rhetoric research have been identified in previous research, namely: the failure to control for the quality of L1 data; an inference approach to interpreting the relationship between L1 and L2 writing; and a focus on national cultural factors in interpreting rhetorical differences. Addressing these limitations, the current study examined the presence or absence and the placement of thesis statements and topic sentences in four sets of argumentative texts produced by three groups of university students. We found that Chinese students tended to favour a direct/deductive approach in their English and Chinese writing, while native English writers typically adopted an indirect/inductive approach. This study argues for a dynamic and ecological interpretation of rhetorical practices in different languages and cultures.