48 results for Local computer network
Abstract:
This thesis describes an investigation into a Local Authority's desire to use its airport to aid regional economic growth. Short studies on air freight, the impact of an airport on the local economy, incoming tourism, and the factors influencing airlines in their use of airports show that this desire is valid, but only in so far as the airport enables air services to be provided. A survey of airlines, conducted to remedy some deficiencies in the documented knowledge on airline decision-making criteria, indicates that there is cause for concern about the methods used to develop air services. A comparison with the West German network suggests that Birmingham is underprovided with international scheduled flights, and reinforces the survey conclusion that an airport authority must become actively involved in the development of air services. Participation in the licence applications of two airlines to use Birmingham Airport confirms the need for involvement, but without showing the extent of the influence which an airport authority may exert. The conclusion is reached that in order to fulfill its development potential, an airport must be marketed to both the general public and the air transport industry. There is also a need for a national air services plan.
Abstract:
A major focus of stem cell research is the generation of neurons that may then be implanted to treat neurodegenerative diseases. However, a picture is emerging where astrocytes are partners to neurons in sustaining and modulating brain function. We therefore investigated the functional properties of NT2 derived astrocytes and neurons using electrophysiological and calcium imaging approaches. NT2 neurons (NT2Ns) expressed sodium dependent action potentials, as well as responses to depolarisation and the neurotransmitter glutamate. NT2Ns exhibited spontaneous and coordinated calcium elevations in clusters and in extended processes, indicating local and long distance signalling. Tetrodotoxin sensitive network activity could also be evoked by electrical stimulation. Similarly, NT2 astrocytes (NT2As) exhibited morphology and functional properties consistent with this glial cell type. NT2As responded to neuronal activity and to exogenously applied neurotransmitters with calcium elevations, and in contrast to neurons, also exhibited spontaneous rhythmic calcium oscillations. NT2As also generated propagating calcium waves that were gap junction and purinergic signalling dependent. Our results show that NT2 derived astrocytes exhibit appropriate functionality and that NT2N networks interact with NT2A networks in co-culture. These findings underline the utility of such cultures to investigate human brain cell type signalling under controlled conditions. Furthermore, since stem cell derived neuron function and survival is of great importance therapeutically, our findings suggest that the presence of complementary astrocytes may be valuable in supporting stem cell derived neuronal networks. Indeed, this also supports the intriguing possibility of selective therapeutic replacement of astrocytes in diseases where these cells are either lost or lose functionality.
Abstract:
In perceptual terms, the human body is a complex 3D shape which has to be interpreted by the observer to judge its attractiveness. Both body mass and shape have been suggested as strong predictors of female attractiveness. Normally body mass and shape co-vary, and it is difficult to differentiate their separate effects. A recent study suggested that altering body mass does not modulate activity in the reward mechanisms of the brain, but shape does. However, using computer-generated female body-shaped greyscale images, based on a Principal Component Analysis of female bodies, we were able to construct images which covary with real female body mass (indexed with BMI) and not with body shape (indexed with WHR), and vice versa. Twelve observers (6 male and 6 female) rated these images for attractiveness during an fMRI study. The attractiveness ratings were correlated with changes in BMI and not WHR. Our primary fMRI results demonstrated that in addition to activation in higher visual areas (such as the extrastriate body area), changing BMI also modulated activity in the caudate nucleus and other parts of the brain reward system. This shows that BMI, not WHR, modulates reward mechanisms in the brain, and we infer that this may have important implications for judgements of ideal body size in individuals with eating disorders.
Abstract:
In this paper, the implementation aspects and constraints of the simplest network coding (NC) schemes for a two-way relay channel (TWRC) composed of a user equipment (mobile terminal), an LTE relay station (RS) and an LTE base station (eNB) are considered in order to assess the usefulness of NC in more realistic scenarios. The information exchange rate gain (IERG), the energy reduction gain (ERG) and the resource utilization gain (RUG) of the NC schemes with and without subcarrier division duplexing (SDD) are obtained by computer simulations. The usefulness of the NC schemes is evaluated for varying traffic load levels, geographical distances between the nodes, RS transmit powers, and maximum numbers of retransmissions. Simulation results show that the NC schemes with and without SDD achieve throughput gains of 0.5% and 25%, ERGs of 7-12% and 16-25%, and RUGs of 0.5-3.2%, respectively. It is found that NC can also provide performance gains for users at the cell edge. Furthermore, the ERGs of NC increase with the transmit power of the relay, while the ERGs of NC remain the same even when the maximum number of retransmissions is reduced.
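The core idea behind the simplest NC scheme for a two-way relay can be illustrated in a few lines: instead of forwarding each direction's packet separately, the relay broadcasts the XOR of the two packets, and each end node recovers the other's message using its own as the key. This is a minimal sketch of that principle only; the paper's LTE/SDD simulation setup and gain figures are not reproduced here.

```python
# Minimal sketch of XOR network coding on a two-way relay channel.
# Packet contents are hypothetical placeholders.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length packets byte by byte."""
    return bytes(x ^ y for x, y in zip(a, b))

# The user equipment and the base station each send a packet to the relay.
pkt_ue = b"UPLINK__"
pkt_enb = b"DNLINK__"

# Without NC the relay forwards each packet in its own slot (two slots).
# With NC it broadcasts a single combined packet (one slot).
combined = xor_bytes(pkt_ue, pkt_enb)

# Each node cancels its own known packet to recover the other's.
recovered_at_enb = xor_bytes(combined, pkt_enb)  # yields pkt_ue
recovered_at_ue = xor_bytes(combined, pkt_ue)    # yields pkt_enb
```

The saved downlink slot is where the exchange-rate and energy gains quantified in the abstract come from.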
Abstract:
We introduce a type of 2-tier convolutional neural network model for learning distributed paragraph representations for a special task (e.g. paragraph- or short-document-level sentiment analysis and text topic categorization). We decompose the paragraph semantics into 3 cascaded constituents: word representation, sentence composition and document composition. Specifically, we learn distributed word representations with a continuous bag-of-words model from a large unstructured text corpus. Then, using these word representations as pre-trained vectors, distributed task-specific sentence representations are learned from a sentence-level corpus with task-specific labels by the first tier of our model. Using these sentence representations as distributed paragraph representation vectors, distributed paragraph representations are learned from a paragraph-level corpus by the second tier of our model. The model is evaluated on the DBpedia ontology classification dataset and the Amazon review dataset. Empirical results show the effectiveness of our proposed learning model for generating distributed paragraph representations.
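The word → sentence → paragraph cascade can be sketched as follows. This is a deliberately simplified stand-in: mean pooling replaces the paper's convolutional tiers, and the word vectors are random placeholders rather than CBOW-trained embeddings, so only the cascaded composition structure is illustrated.

```python
import numpy as np

# Toy word vectors (the paper learns these with CBOW; here they are
# random placeholders keyed by word).
rng = np.random.default_rng(0)
vocab = {w: rng.standard_normal(4)
         for w in "the cat sat on a mat dogs bark loudly".split()}

def compose(vectors):
    """Simplified composition: mean pooling stands in for a
    convolutional tier."""
    return np.mean(vectors, axis=0)

def sentence_vec(sentence):
    # Tier 1: compose word vectors into a sentence vector.
    return compose([vocab[w] for w in sentence.split()])

def paragraph_vec(sentences):
    # Tier 2: compose sentence vectors into a paragraph vector.
    return compose([sentence_vec(s) for s in sentences])

p = paragraph_vec(["the cat sat on a mat", "dogs bark loudly"])
```

The paragraph vector `p` lives in the same space as the word vectors here; in the actual model each tier learns its own task-specific representation.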
Abstract:
Text classification is essential for narrowing down the number of documents relevant to a particular topic for further perusal, especially when searching through large biomedical databases. Protein-protein interactions are an example of such a topic, with databases being devoted specifically to them. This paper proposes a semi-supervised learning algorithm via local learning with class priors (LL-CP) for biomedical text classification, where unlabeled data points are classified in a vector space based on their proximity to labeled nodes. The algorithm has been evaluated on a corpus of biomedical documents to identify abstracts containing information about protein-protein interactions, with promising results. Experimental results show that LL-CP outperforms traditional semi-supervised learning algorithms such as SVM, and it also performs better than local learning without incorporating class priors.
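The "proximity to labeled nodes, adjusted by class priors" idea can be sketched as a prior-weighted nearest-neighbour vote. This is a simplified illustration, not the published LL-CP algorithm; the vectors, labels and prior values below are invented.

```python
import numpy as np

def classify_unlabeled(x, labeled_X, labeled_y, class_priors, k=3):
    """Label a point from its k nearest labeled neighbours,
    weighting each vote by its class prior and inverse distance
    (a simplified stand-in for LL-CP)."""
    dists = np.linalg.norm(labeled_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    scores = {}
    for i in nearest:
        c = labeled_y[i]
        scores[c] = scores.get(c, 0.0) + class_priors[c] / (dists[i] + 1e-9)
    return max(scores, key=scores.get)

# Hypothetical document vectors: two off-topic, two PPI abstracts.
X = np.array([[0.0, 0.0], [0.1, 0.2], [3.0, 3.0], [3.2, 2.9]])
y = ["irrelevant", "irrelevant", "ppi", "ppi"]
priors = {"ppi": 0.3, "irrelevant": 0.7}

label = classify_unlabeled(np.array([2.8, 3.1]), X, y, priors)
# label -> "ppi": both close neighbours are PPI abstracts
```

In a real pipeline the vectors would come from a document representation (e.g. TF-IDF) and the priors from the labeled portion of the corpus.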
Abstract:
Quality of service (QoS) support is critical for collaborative road safety applications based on dedicated short range communications (DSRC) vehicle networks. In this paper we propose an adaptive power and message rate control method for DSRC vehicle networks at road intersections. The design objective is to provide high-availability and low-latency channels for high-priority emergency safety applications while maximizing channel utilization for low-priority routine safety applications. In this method an offline simulation-based approach is used to find the best possible configurations of transmit power and message rate for given numbers of vehicles in the network. The identified best configurations are then used online by roadside access points (APs) according to the estimated number of vehicles. Simulation results show that this adaptive method significantly outperforms a fixed control method. © 2011 Springer-Verlag.
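The offline-table / online-lookup structure described above can be sketched in a few lines. The configuration values here are invented for illustration; the actual best configurations come from the paper's offline simulations.

```python
# Offline: simulation finds the best (tx_power_dBm, msgs_per_sec)
# per vehicle-count bracket. Values below are hypothetical.
best_config = {
    (0, 50): (20, 10),
    (50, 100): (17, 8),
    (100, 200): (14, 5),
}

def pick_config(estimated_vehicles: int):
    """Online: the roadside AP selects a configuration from the
    estimated number of vehicles."""
    for (lo, hi), cfg in best_config.items():
        if lo <= estimated_vehicles < hi:
            return cfg
    # Outside all brackets: fall back to the most conservative
    # (lowest power and rate) entry.
    return min(best_config.values())

power, rate = pick_config(75)  # falls in the (50, 100) bracket
```

Splitting the work this way keeps the online decision cheap enough for a roadside AP while still adapting power and rate to the observed traffic density.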
Abstract:
Artificial Immune Systems are well suited to the problem of using a profile representation of an individual’s or a group’s interests to evaluate documents. Nootropia is a user profiling model that exhibits similarities to models of the immune system that have been developed in the context of autopoietic theory. It uses a self-organising term network that can represent a user’s multiple interests and can adapt to both short-term variations and substantial changes in them. This allows Nootropia to drift, constantly following changes in the user’s multiple interests, and, thus, to become structurally coupled to the user.
Abstract:
Efficiency in the mutual fund (MF) industry is one of the issues that has attracted many investors in countries with advanced financial markets for many years. Due to the need for frequent study of MF efficiency over short-term periods, investors need a method that has not only high accuracy but also high speed. Data envelopment analysis (DEA) is proven to be one of the most widely used methods in the measurement of the efficiency and productivity of decision-making units (DMUs). DEA for a large dataset with many inputs/outputs would require huge computer resources in terms of memory and CPU time. This paper uses a back-propagation neural network DEA to measure mutual fund efficiency and shows that the proposed method's requirements for computer memory and CPU time are far less than those of conventional DEA methods; it can therefore be a useful tool in measuring the efficiency of a large set of MFs. Copyright © 2014 Inderscience Enterprises Ltd.
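The idea of trading a heavy optimisation for a cheap learned approximation can be sketched as follows: train a small feed-forward network on efficiency scores precomputed for a sample of funds, then score new funds with a forward pass. Everything below is synthetic; a real pipeline would use scores produced by conventional DEA, and the architecture and learning rate are arbitrary choices for illustration.

```python
import numpy as np

# Synthetic stand-in data: 200 "funds" with 3 features each, and a
# made-up efficiency target (the mean of the features).
rng = np.random.default_rng(1)
X = rng.random((200, 3))
y = X.mean(axis=1, keepdims=True)

# A tiny 3-8-1 feed-forward network with a tanh hidden layer.
W1 = rng.standard_normal((3, 8)) * 0.5
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5
b2 = np.zeros(1)

lr = 0.1
for _ in range(2000):  # plain batch gradient descent (back-propagation)
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    gh = err @ W2.T * (1 - h ** 2)   # back-propagate through tanh
    gW1 = X.T @ gh / len(X)
    gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

Once trained, scoring a new fund is a single matrix multiply, which is the source of the memory and CPU savings the abstract reports over re-solving the DEA programs.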
Abstract:
Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights into the domain which can be helpful in developing powerful models, but they need a modelling framework that helps them to use these insights. Data visualisation is an effective technique for presenting data and eliciting feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of input space, often work better, since the behaviour of different areas may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective, and they also lack involvement of the domain experts to guide a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The segmentation output obtained is then used to develop effective local regression models.
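The local-models idea can be sketched with one linear regression per expert-defined segment. The segment boundaries below are hypothetical, standing in for the interactive visual segmentation, and the piecewise data is synthetic.

```python
import numpy as np

# Synthetic data with two regimes: slope 2 below x=5, slope -1 above.
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 300)
y = np.where(x < 5, 2 * x + 1, -x + 16) + rng.normal(0, 0.1, 300)

# Segments as chosen by a domain expert via the visualisation.
segments = [(0, 5), (5, 10)]
models = []
for lo, hi in segments:
    mask = (x >= lo) & (x < hi)
    coef = np.polyfit(x[mask], y[mask], 1)  # one local linear model
    models.append(((lo, hi), coef))

def predict(x0):
    """Route a query to the local model owning its segment."""
    for (lo, hi), coef in models:
        if lo <= x0 < hi:
            return np.polyval(coef, x0)

# Each local model recovers its own regime's slope, which a single
# global linear fit over all of x could not do.
```

A single global line fitted to this data would average the two slopes and fit neither regime; the expert-guided split lets each local model match its region's behaviour.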
Abstract:
These case studies from CIMA highlight the need to embed risk management within more easily understood behaviours, consistent with the overall organisational culture. In each case, some form of internal audit team provides either an oversight function or acts as an expert link in that feedback loop. Frontline staff, managers and specialists should be completely aligned on risk, in part just to ensure that there is a consistency of approach. They should understand instinctively that good performance includes good risk management. Tesco has continued to thrive during the recession and remains a robust and efficient group of businesses despite the emergence of potential threats around consumer spending and the supply chain. RBS, by contrast, has suffered catastrophic and very public failures of risk management despite a large in-house function and stiff regulation of risk controls. Birmingham City Council, like all local authorities, is adapting to more commercial modes of operation and is facing diverse threats and opportunities emerging as a result of social change. And DCMS, like many other public sector organisations, has to handle an incredibly complex network of delivery partners within the context of a relatively recent overhaul of central government risk management processes. Key Findings:
• Risk management is no longer solely a financial discipline, nor is it simply a concern for the internal control function.
• Where organisations retain a discrete risk management cadre – often specialists at monitoring and evaluating a range of risks – their success is dependent on embedding risk awareness in the wider culture of the enterprise.
• Risk management is most successful when it is explicitly linked to operational performance.
• Clear leadership, specific goals, excellent influencing skills and open-mindedness to potential threats and opportunities are essential for effective risk management.
• Bureaucratic processes and systems can hamper good risk management – either as a result of a ‘box-ticking mentality’ or because managers and staff believe they do not need to consider risk themselves.
Abstract:
Epilepsy is one of the most common neurological disorders, a large fraction of which is resistant to pharmacotherapy. In this light, understanding the mechanisms of epilepsy and its intractable forms in particular could create new targets for pharmacotherapeutic intervention. The current project explores the dynamic changes in neuronal network function in chronic temporal lobe epilepsy (TLE) in rat and human brain in vitro. I focused on the process of establishment of epilepsy (epileptogenesis) in the temporal lobe. Rhythmic behaviour of the hippocampal neuronal networks in healthy animals was explored using spontaneous oscillations in the gamma frequency band (SγO). The use of an improved brain slice preparation technique resulted in the natural occurrence (in the absence of pharmacological stimulation) of rhythmic activity, which was then pharmacologically characterised and compared to other models of gamma oscillations (KA- and CCh-induced oscillations) using the local field potential recording technique. The results showed that SγO differed from pharmacologically driven models, suggesting higher physiological relevance of SγO. Network activity was also explored in the medial entorhinal cortex (mEC), where spontaneous slow wave oscillations (SWO) were detected. To investigate the course of chronic TLE establishment, a refined Li-pilocarpine-based model of epilepsy (RISE) was developed. The model significantly reduced animal mortality and demonstrated reduced intensity, yet high morbidity, with an almost 70% mean success rate of developing spontaneous recurrent seizures. We used SγO to characterize changes in the hippocampal neuronal networks throughout the epileptogenesis. The results showed that the network remained largely intact, demonstrating the subtle nature of the RISE model.
Despite this, a reduction in network activity was detected during the so-called latent (no seizure) period, which was hypothesized to occur due to network fragmentation and an abnormal function of kainate receptors (KAr). We therefore explored the function of KAr by challenging SγO with kainic acid (KA). The results demonstrated a remarkable decrease in KAr response during the latent period, suggesting KAr dysfunction or altered expression, which will be further investigated using a variety of electrophysiological and immunocytochemical methods. The entorhinal cortex, together with the hippocampus, is known to play an important role in TLE. Considering this, we investigated neuronal network function of the mEC during epileptogenesis using SWO. The results demonstrated a striking difference in AMPAr function, with possible receptor upregulation or abnormal composition in the early development of epilepsy. Alterations in receptor function inevitably lead to changes in network function, which may play an important role in the development of epilepsy. Preliminary investigations were made using slices of human brain tissue taken following surgery for intractable epilepsy. Initial results showed that oscillogenesis could be induced in human brain slices and that such network activity was pharmacologically similar to that observed in rodent brain. Overall, our findings suggest that excitatory glutamatergic transmission is heavily involved in the process of epileptogenesis. Together with other types of receptors, KAr and AMPAr contribute to epilepsy establishment and may be the key to uncovering its mechanism.
Abstract:
Smart cameras allow pre-processing of video data on the camera instead of sending it to a remote server for further analysis. Having a network of smart cameras allows various vision tasks to be processed in a distributed fashion. While cameras may have different tasks, we concentrate on distributed tracking in smart camera networks. This application introduces various highly interesting problems. Firstly, how can conflicting goals be satisfied, such as cameras in the network trying to track objects while also trying to keep communication overhead low? Secondly, how can cameras in the network self-adapt in response to the behavior of objects and changes in scenarios, to ensure continued efficient performance? Thirdly, how can cameras organise themselves to improve the overall network's performance and efficiency? This paper presents a simulation environment, called CamSim, allowing distributed self-adaptation and self-organisation algorithms to be tested without setting up a physical smart camera network. The simulation tool is written in Java and hence allows high portability between different operating systems. Relaxing various problems of computer vision and network communication enables a focus on implementing and testing new self-adaptation and self-organisation algorithms for cameras to use.
Abstract:
Purpose – The purpose of this paper is to explore the importance of host country networks and organisation of production in the context of international technology transfer that accompanies foreign direct investment (FDI).
Design/methodology/approach – The empirical analysis is based on unbalanced panel data covering Japanese firms active in two-digit manufacturing sectors over a seven-year period. Given the self-selection problem affecting past sectoral-level studies, using firm-level panel data is a prerequisite to provide robust empirical evidence.
Findings – While Japan is thought of as being a technologically advanced country, the results show that vertical productivity spillovers from FDI occur in Japan, but they are sensitive to technological differences between domestic firms and the idiosyncratic Japanese institutional network. FDI in vertically organised keiretsu sectors generates inter-industry spillovers through backward and forward linkages, while FDI within sectors linked to vertical keiretsu activities adversely affects domestic productivity. Overall, our results suggest that the role of vertical keiretsu is more prevalent than that of horizontal keiretsu.
Originality/value – Japan’s industrial landscape has been dominated by institutional clusters or networks of inter-firm organisations through reciprocated, direct and indirect ties. However, interactions between inward investors and such institutionalised networks in the host economy are seldom explored. The role and characteristics of local business groups, in the form of keiretsu networks, have been investigated to determine the scale and scope of spillovers from inward FDI to Japanese establishments. This conceptualisation depends on the institutional mechanism and the market structure through which host economies absorb and exploit FDI.
Abstract:
Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). The need for huge computer resources in terms of memory and CPU time in DEA is inevitable for a large-scale data set, especially with negative measures. In recent years, a wide range of studies has been conducted in the area of combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has some computational advantages over the corresponding DEA models; therefore, it can be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.