891 results for TCP-friendliness


Relevance: 10.00%

Abstract:

The preparation and characterisation of novel biodegradable polymer fibres for application in tissue engineering and drug delivery are reported. Poly(ε-caprolactone) (PCL) fibres were produced by wet spinning from solutions in acetone under low shear (gravity flow) conditions. The tensile strength and stiffness of as-spun fibres were highly dependent on the concentration of the spinning solution. Use of a 6% w/v solution resulted in fibres having strength and stiffness of 1.8 MPa and 0.01 GPa respectively, whereas these values increased to 9.9 MPa and 0.1 GPa when fibres were produced from 20% w/v solutions. Cold drawing to an extension of 500% resulted in further increases in fibre strength (up to 50 MPa) and stiffness (0.3 GPa). Hot drawing to 500% further increased the fibre strength (up to 81 MPa) and stiffness (0.5 GPa). The surface morphology of as-spun fibres was modified to yield a directional grooved pattern by drying in contact with a mandrel having a machined topography characterised by a peak-to-peak separation of 91 µm and a peak height of 30 µm. Differential scanning calorimetry (DSC) analysis of as-spun fibres revealed the characteristic melting point of PCL at around 58°C and a crystallinity of approximately 60%. The biocompatibility of as-spun fibres was assessed using cell culture. The numbers of attached 3T3 Swiss mouse fibroblasts, C2C12 mouse myoblasts and human umbilical vein endothelial cells (HUVECs) on as-spun, 500% cold-drawn and gelatin-coated PCL fibres were observed. The results showed that the fibres supported cell proliferation over 9 days in culture, although proliferation was slightly lower than on tissue culture plastic. The morphology of all cell lines was assessed on the various PCL fibres using scanning electron microscopy. The cell function of HUVECs growing on the as-spun PCL fibres was evaluated. The ability of HUVECs to mount an immune response when stimulated with lipopolysaccharide (LPS), and thereby to increase the amount of cell surface receptors, was assessed by flow cytometry and reverse transcription-polymerase chain reaction (RT-PCR). The results showed that PCL fibres did not inhibit this function compared to tissue culture plastic (TCP). As-spun PCL fibres were loaded with 1% ovine albumin (OVA) powder, 1% OVA nanoparticles and 5% OVA nanoparticles by weight, and the protein release was assessed in vitro. PCL fibres loaded with 1% OVA powder released 70%, those loaded with 1% OVA nanoparticles released 60%, and those loaded with 5% OVA nanoparticles released 25% of their protein content over 28 days. These release figures did not alter when the fibres were subjected to lipase enzymatic degradation. The released OVA was examined for structural integrity by SDS-PAGE, which showed that the protein molecular weight was not altered after incorporation into the fibres. The bioactivity of progesterone was assessed following incorporation into PCL fibres. Results showed that the released progesterone had a pronounced effect on MCF-7 breast epithelial cells, inhibiting their proliferation. The PCL fibres display high fibre compliance, a potential for controlling the fibre surface architecture to promote contact guidance effects, favourable proliferation rates of fibroblasts, myoblasts and HUVECs, and the ability to release pharmaceuticals. These properties recommend their use for 3-D scaffold production in soft tissue engineering, and the fibres could also be exploited for controlled presentation and release of biopharmaceuticals such as growth factors.

Relevance: 10.00%

Abstract:

A cell culture model of the gastric epithelial cell surface would prove useful for biopharmaceutical screening of new chemical entities and dosage forms. A successful model should exhibit tight junction formation, maintenance of differentiation and polarity. Conditions for primary culture of guinea-pig gastric mucous epithelial cell monolayers on tissue culture plastic (TCP) and membrane inserts (Transwells) were established. Tight junction formation for cells grown on Transwells for three days was assessed by measurement of transepithelial resistance (TEER) and permeability of mannitol and fluorescein. Coating the polycarbonate filter with collagen IV, rather than with collagen I, enhanced tight junction formation. TEER for cells grown on Transwells coated with collagen IV was close to that obtained with intact guinea-pig gastric epithelium in vitro. Differentiation was assessed by incorporation of [3H] glucosamine into glycoprotein and by activity of NADPH oxidase, which produces superoxide. Both of these measures were greater for cells grown on filters coated with collagen I than for cells grown on TCP, but no major difference was found between cells grown on collagens I and IV. However, monolayers grown on membranes coated with collagen IV exhibited apically polarized secretion of mucin and superoxide. The proportion of cells which stained positively for mucin with periodic acid-Schiff reagent was greater than 95% for all culture conditions. Gastric epithelial monolayers grown on Transwells coated with collagen IV were able to withstand transient (30 min) apical acidification to pH 3, which was associated with a decrease in [3H] mannitol flux and an increase in TEER relative to pH 7.4. The model was used to provide the first direct demonstration that an NSAID (indomethacin) accumulated in gastric epithelial cells exposed to low apical pH. In conclusion, guinea-pig epithelial cells cultured on collagen IV represent a promising model of the gastric surface epithelium suitable for screening procedures.

Relevance: 10.00%

Abstract:

This thesis reports a cross-national study carried out in England and India in an attempt to clarify the association of certain cultural and non-cultural characteristics with people's work-related attitudes and values, and with the structure of their work organizations. Three perspectives are considered to be relevant to the objectives of the study. The contingency perspective suggests that a 'fit' between an organization's context and its structural arrangements will be fundamentally necessary for achieving success and survival. The political economy perspective argues for the determining role of the social and economic structures within which the organization operates. The culturalist perspective looks to cultural attitudes and values of organizational members for an explanation for their organization's structure. The empirical investigation was carried out in three stages in each of the two countries involved by means of surveys of cultural attitudes, work-related attitudes and organizational structures and systems. The cultural surveys suggested that Indian and English people were different from one another with regard to fear of, and respect and obedience to, their seniors, ability to cope with ambiguity, honesty, independence, expression of emotions, fatalism, reserve, and care for others; they were similar with regard to tolerance, friendliness, attitude to change, attitude to law, self-control and self-confidence, and attitude to social differentiation. The second stage of the study, involving the employees of fourteen organizations, found that the English ones perceived themselves to have more power at work, expressed more tolerance for ambiguity, and had different expectations from their job than did the Indian equivalents. The two samples were similar with respect to commitment to their company and trust in their colleagues. The findings also suggested that employees' occupations, education and age had some influences on their work-related attitudes. The final stage of the research was a study of structures, control systems, and reward and punishment policies of the same fourteen organizations which were matched almost completely on their contextual factors across the two countries. English and Indian organizations were found to be similar in terms of centralization, specialization, chief executive's span of control, height and management control strategies. English organizations, however, were far more formalized, spent more time on consultation and their managers delegated authority lower down the hierarchy than Indian organizations. The major finding of the study was the multiple association that cultural, national and contingency factors had with the structural characteristics of the organizations and with the work-related attitudes of their members. On the basis of this finding, a multi-perspective model for understanding organizational structures and systems is proposed in which the contributions made by contingency, political economy and cultural perspectives are recognized and incorporated.

Relevance: 10.00%

Abstract:

The contributions of this research fall into three distinct, but related, areas. The focus of the work is on improving the efficiency of video content distribution in networks that are liable to packet loss, such as the Internet. Initially, the benefits and limitations of content distribution using Forward Error Correction (FEC) in conjunction with the Transmission Control Protocol (TCP) are presented. Since added FEC can be used to reduce the number of retransmissions, the requirement for TCP to deal with any losses is greatly reduced. For real-time applications, delay must be kept to a minimum and retransmissions are undesirable, so a balance must be struck between the additional bandwidth consumed by FEC and the delays caused by retransmissions. This is followed by the proposal of a hybrid transport, specifically for H.264-encoded video, as a compromise between delay-prone TCP and loss-prone UDP. It is argued that the playback quality at the receiver often need not be 100% perfect, provided a certain level is assured. Reliable TCP is used to transmit and guarantee delivery of the most important packets. The delay associated with the proposal is measured, and its potential as an alternative to the conventional methods of transporting video by either TCP or UDP alone is demonstrated. Finally, a new objective measurement is investigated for assessing the playback quality of video transported using TCP. A new metric is defined to characterise the quality of playback in terms of its continuity. Using packet traces generated from real TCP connections in a lossy environment, the playback of a video can be simulated while monitoring buffer behaviour to calculate pause intensity values. Subjective tests are conducted to verify the effectiveness of the metric and show that the objective and subjective scores are closely correlated.
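To make the pause-intensity idea above concrete, the following sketch replays a packet-arrival trace against a fixed playout rate and records the pauses caused by buffer underrun. It is a minimal, hypothetical reconstruction rather than code from the thesis: the trace format, packet size, playout rate, rebuffering threshold and the final ratio-style statistic are all assumptions introduced here.

# Minimal sketch (not from the thesis): replay a TCP packet trace against a
# fixed playout rate, record pauses caused by buffer underrun, and summarise
# them with a simple pause-intensity-style ratio. All parameters are assumptions.

def simulate_playback(arrival_times, bytes_per_packet=1400,
                      playout_rate=250_000, start_threshold=100_000):
    """arrival_times: sorted, non-empty packet arrival times (seconds)."""
    buffered = 0          # bytes currently in the playout buffer
    clock = 0.0           # current trace time (seconds)
    playing = False
    pauses = []           # (pause_start, pause_end) intervals
    pause_start = 0.0

    for t in arrival_times:
        if playing:
            drained = (t - clock) * playout_rate
            if drained >= buffered:          # buffer emptied before this packet
                pause_start = clock + buffered / playout_rate
                playing = False
                buffered = 0
            else:
                buffered -= drained
        clock = t
        buffered += bytes_per_packet
        if not playing and buffered >= start_threshold:
            playing = True
            if pause_start > 0:              # initial buffering is not a pause
                pauses.append((pause_start, clock))

    total_paused = sum(end - start for start, end in pauses)
    duration = arrival_times[-1] - arrival_times[0]
    return {"pause_count": len(pauses),
            "paused_fraction": total_paused / duration if duration else 0.0}

# Example: simulate_playback([i * 0.005 for i in range(2000)]) describes a
# smooth 10-second trace; inserting gaps in the arrival times produces pauses.

The returned pause count and paused fraction are the kind of frequency and duration information that a pause-intensity style metric combines; the thesis's exact formulation may differ.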

Relevance: 10.00%

Abstract:

The research is concerned with the measurement of residents' evaluations of the environmental quality of residential areas. The research reflects the increased attention being given to residents' values in planning decisions affecting the residential environment. The work was undertaken in co-operation with a local authority which was in the process of revising its housing strategy, and in particular the priorities for improvement action. The study critically examines the existing evidence on environmental values and their relationship to the environment and points to a number of methodological and conceptual deficiencies. The research strategy developed on the basis of the research review was constrained by the need to keep any survey methods simple so that they could easily be repeated, when necessary, by the sponsoring authority. A basic perception model was assumed, and a social survey carried out to measure residents' responses to different environmental conditions. The data was only assumed to have ordinal properties, necessitating the extensive use of non-parametric statistics. Residents' expressions of satisfaction with the component elements of the environment (ranging from convenience to upkeep and privacy) were successfully related to 'objective' measures of the environment. However the survey evidence did not justify the use of the 'objective' variables as environmental standards. A method of using the social survey data directly as an aid to decision-making is discussed. Alternative models of the derivation of overall satisfaction with the environment are tested, and the values implied by the additive model compared with residents' preferences as measured directly in the survey. Residents' overall satisfactions with the residential environment were most closely related to their satisfactions with the "Appearance" and the "Reputation" of their areas. By contrast the most important directly measured preference was "Friendliness of area". The differences point to the need to define concepts used in social research clearly in operational terms, and to take care in the use of values 'measured' by different methods.

Relevance: 10.00%

Abstract:

This paper describes work conducted as a joint collaboration between the Virtual Design Team (VDT) research group at Stanford University (USA), the Systems Engineering Group (SEG) at De Montfort University (UK) and Elipsis Ltd. We describe a new docking methodology in which we combine the use of two radically different types of organizational simulation tool. The VDT simulation tool operates on a standalone computer, and employs computational agents during simulated execution of a pre-defined process model (Kunz, 1998). The other software tool, DREAMS, operates over a standard TCP/IP network, and employs human agents (real people) during a simulated execution of a pre-defined process model (Clegg, 2000).

Relevance: 10.00%

Abstract:

The development of a distributed information measurement and control system for optical spectral research of particle beams and plasma objects, and for running laboratory classes in the Physics and Engineering Department of Petrozavodsk State University, is described. At the hardware level the system is a complex of automated workplaces joined into a computer network. The key element of the system is the communication server, which supports multi-user operation, distributes resources among clients, monitors the system and provides secure access. The other system components are equipment servers (CAMAC and GPIB servers, a server for access to MCS-196 microcontrollers, and others) and the client programs that carry out data acquisition, accumulation and processing, as well as management of the course of the experiment. The network interface designed by the authors is also discussed. It connects measuring and executive devices to the distributed information measurement and control system via Ethernet, allowing experimental parameters to be controlled through digital devices and monitored by polling analog and digital sensors. The device firmware is written in assembly language and includes libraries for forming Ethernet, IP, TCP and UDP packets.
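For illustration, a client in such a system might poll an equipment server over TCP along the following lines. This is a hypothetical sketch: the host name, port and the line-oriented "POLL <sensor>" request format are invented for the example, since the abstract does not specify the actual wire protocol.

# Hypothetical sketch of a client polling an equipment server over TCP.
# Host, port and the "POLL <sensor>" protocol are assumptions for illustration.
import socket

def poll_sensor(host: str, port: int, sensor: str, timeout: float = 2.0) -> str:
    """Send a single poll request and return the server's one-line reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(f"POLL {sensor}\n".encode("ascii"))
        reply = bytearray()
        while not reply.endswith(b"\n"):
            chunk = sock.recv(1024)
            if not chunk:              # server closed the connection
                break
            reply.extend(chunk)
    return reply.decode("ascii").strip()

# Example (assumes a server is listening on the given host and port):
# print(poll_sensor("camac-server.local", 5025, "spectrometer/ch1"))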

Relevance: 10.00%

Abstract:

Video streaming via Transmission Control Protocol (TCP) networks has become a popular and highly demanded service, but its quality assessment in both objective and subjective terms has not been properly addressed. In this paper, based on statistical analysis, a full analytic model of a no-reference objective metric, namely pause intensity (PI), for video quality assessment is presented. The model characterizes the video playout buffer behavior in connection with the network performance (throughput) and the video playout rate. This allows for instant quality measurement and control without requiring a reference video. PI specifically addresses the need for assessing quality in terms of the continuity of playout of TCP streaming videos, which cannot be properly measured by other objective metrics such as peak signal-to-noise ratio, structural similarity, and buffer underrun or pause frequency. The performance of the analytical model is rigorously verified by simulation results and subjective tests using a range of video clips. It is demonstrated that PI is closely correlated with viewers' opinion scores regardless of the vastly different composition of its individual elements, such as pause duration and pause frequency, which jointly constitute this new quality metric. It is also shown that the correlation performance of PI is consistent and content independent. © 2013 IEEE.
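As a rough formalisation of the relationship the abstract describes between buffer behaviour, throughput and playout rate, one might write the following. This is a hedged sketch with symbols introduced here (T for throughput, R for playout rate, B for buffer occupancy, d_i for pause durations, N for the number of pauses, L for the playback period), not the paper's actual derivation.

% Hedged sketch, not the paper's exact model.
% B(t): buffer occupancy, T(t): throughput, R: playout rate,
% d_i: duration of the i-th pause, N: number of pauses, L: playback period.
\frac{\mathrm{d}B(t)}{\mathrm{d}t} \approx T(t) - R\,\mathbf{1}\{\text{playing at } t\},
\qquad \text{a pause begins whenever } B(t) \text{ reaches } 0,
\qquad \mathrm{PI} \sim \frac{1}{L}\sum_{i=1}^{N} d_i .

The last expression simply folds pause duration and pause frequency into one continuity statistic, in the spirit of the metric described above; the paper's exact formulation may differ.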

Relevance: 10.00%

Abstract:

This work looks into video quality assessment applied to the field of telecare and proposes an alternative metric to the more traditionally used PSNR, based on the requirements of such an application. We show that the Pause Intensity metric introduced in [1] is also relevant and applicable to heterogeneous networks with a wireless last hop connected to a wired TCP backbone. We demonstrate through our emulation testbed that the impairments experienced in such a network architecture are dominated by continuity-based impairments rather than artifacts such as motion drift or blockiness. We also look into the implications of using Pause Intensity as a metric in terms of the overall video latency, which is potentially problematic should the video be sent and acted upon in real time. We conclude that Pause Intensity may be used alongside the video characteristics which have been suggested as a measure of the overall video quality. © 2012 IEEE.

Relevance: 10.00%

Abstract:

In this work we deal with video streams over TCP networks and propose an alternative measurement to the widely used and accepted peak signal-to-noise ratio (PSNR), owing to the limitations of this metric in the presence of temporal errors. A test-bed was created to simulate buffer under-run in scalable video streams, and the pauses produced as a result of the buffer under-run were inserted into the video before being employed as the subject of subjective testing. The pause intensity metric proposed in [1] was compared with the subjective results, and it was shown that in spite of reductions in frame rate and resolution, a correlation with pause intensity still exists. Based on these conclusions, the metric may be employed for layer selection in scalable video streams. © 2011 IEEE.
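The layer-selection use suggested above could look roughly like the following sketch: pick the highest scalable layer whose predicted pause intensity stays under a target. This is hypothetical, not the paper's mechanism; the predictor, threshold and toy capacity figure are stand-ins.

# Hypothetical sketch, not from the paper: choose the highest scalable layer
# whose predicted pause intensity stays below a target threshold.
def select_layer(layers, predict_pause_intensity, max_pi=0.05):
    """layers: list of (name, bitrate) sorted from lowest to highest bitrate."""
    chosen = layers[0]                       # always fall back to the base layer
    for name, bitrate in layers:
        if predict_pause_intensity(bitrate) <= max_pi:
            chosen = (name, bitrate)         # highest acceptable layer so far
    return chosen

# Example with a toy predictor: PI grows as the bitrate approaches capacity.
capacity = 4_000_000
layers = [("base", 1_000_000), ("enh1", 2_000_000), ("enh2", 3_500_000)]
print(select_layer(layers, lambda r: max(0.0, (r / capacity) - 0.8)))  # -> ('enh1', 2000000)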

Relevance: 10.00%

Abstract:

Hurricanes, earthquakes, floods, and other serious natural hazards have been attributed with causing changes in regional economic growth, income, employment, and wealth. Natural disasters are said to cause: (1) an acceleration of existing economic trends; (2) an expansion of employment and income, due to recovery operations (the so-called silver lining); and (3) an alteration in the structure of regional economic activity due to changes in "intra" and "inter" regional trading patterns, and technological change. Theoretical and stylized disaster simulations (Cochrane 1975; Haas, Cochrane, and Kates 1977; Petak et al. 1982; Ellson et al. 1983, 1984; Boisvert 1992; Brookshire and McKee 1992) point towards a wide scope of possible negative and long-lasting impacts upon economic activity and structure. This work examines the consequences of Hurricane Andrew on Dade County's economy. Following the work of Ellson et al. (1984), Guimaraes et al. (1993), and West and Lenze (1993; 1994), a regional econometric forecasting model (DCEFM) using a framework of "with" and "without" the hurricane is constructed and utilized to assess Hurricane Andrew's impact on the structure and level of economic activity in Dade County, Florida. The results of the simulation exercises show that the direct economic impact associated with Hurricane Andrew on Dade County is of short duration and of isolated sectoral impact, with impact generally limited to the construction, TCP (transportation, communications, and public utilities), and agricultural sectors. Regional growth, and changes in income and employment, reacted directly to, and within the range and direction set by, national economic activity. The simulations also lead to the conclusion that areal extent, infrastructure, and sector-specific damages or impacts, as opposed to monetary losses, are the primary determinants of a disaster's effects upon employment, income, growth, and economic structure.

Relevance: 10.00%

Abstract:

In recent years, the Internet has grown exponentially and become more complex. This increased complexity potentially introduces more network-level instability, yet for any end-to-end Internet connection it is important to maintain throughput and reliability at a certain level, because these directly affect the connection's normal operation. A challenging research task is therefore to improve a connection's performance by optimizing its throughput and reliability. This dissertation proposed an efficient and reliable transport-layer protocol, concurrent TCP (cTCP), an extension of the current TCP protocol, to optimize end-to-end connection throughput and enhance end-to-end fault tolerance. The proposed cTCP protocol can aggregate the bandwidth of multiple paths by supporting concurrent data transfer (CDT) on a single connection, where concurrent data transfer is defined as the concurrent transfer of data from local hosts to foreign hosts via two or more end-to-end paths. An RTT-based CDT mechanism, which uses a path's RTT (round-trip time) to optimize CDT performance, was developed for the proposed cTCP protocol. This mechanism primarily included an RTT-based load distribution and path management scheme, used to optimize connection throughput and reliability, together with an RTT-based congestion control and retransmission policy. Experiment results showed that, under different network conditions, the RTT-based CDT mechanism achieved good CDT performance. Finally, a CWND-based CDT mechanism, which uses a path's CWND (congestion window) to optimize CDT performance, was introduced. This mechanism primarily included: a CWND-based load allocation scheme, which assigned data to paths based on their CWND to achieve aggregate bandwidth; a CWND-based path management scheme, used to optimize connection fault tolerance; and a congestion control and retransmission management policy, which handles each path separately in a manner similar to regular TCP. The corresponding experiment results showed that this mechanism achieved near-optimal CDT performance under different network conditions.
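To make the load-distribution idea concrete, here is a small illustrative sketch (not the dissertation's algorithm) that splits a block of data across paths using weights derived either from CWND/RTT, a crude estimate of each path's achievable throughput, or from CWND alone. The Path fields, the weighting rules and the example numbers are assumptions.

# Illustrative sketch, not the cTCP implementation: split outgoing data across
# multiple paths using per-path weights. Weighting by cwnd/rtt approximates each
# path's share of achievable throughput; weighting by cwnd alone mirrors a
# CWND-based allocation. All fields and rules here are assumptions.
from dataclasses import dataclass

@dataclass
class Path:
    name: str
    rtt: float   # smoothed round-trip time in seconds
    cwnd: int    # congestion window in bytes

def split_load(total_bytes: int, paths: list[Path], mode: str = "rtt") -> dict[str, int]:
    if mode == "rtt":
        weights = [p.cwnd / p.rtt for p in paths]   # ~ per-path bandwidth estimate
    else:
        weights = [float(p.cwnd) for p in paths]    # CWND-based allocation
    total_w = sum(weights)
    shares = {p.name: int(total_bytes * w / total_w) for p, w in zip(paths, weights)}
    # give any rounding remainder to the highest-weight path
    best = max(zip(paths, weights), key=lambda pw: pw[1])[0]
    shares[best.name] += total_bytes - sum(shares.values())
    return shares

paths = [Path("wifi", rtt=0.040, cwnd=64_000), Path("lte", rtt=0.090, cwnd=96_000)]
print(split_load(1_000_000, paths, mode="rtt"))   # e.g. {'wifi': 600000, 'lte': 400000}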

Relevance: 10.00%

Abstract:

Today, the development of domain-specific communication applications is both time-consuming and error-prone, because the low-level communication services provided by existing systems and networks are primitive and often heterogeneous. Multimedia communication applications are typically built on top of low-level network abstractions such as TCP/UDP sockets, SIP (Session Initiation Protocol) and RTP (Real-time Transport Protocol) APIs. The User-centric Communication Middleware (UCM) is proposed to encapsulate the networking complexity and heterogeneity of basic multimedia and multi-party communication for upper-layer communication applications. UCM provides a unified user-centric communication service to diverse communication applications, ranging from a simple phone call and video conferencing to specialized applications such as disaster management and telemedicine, making the development of domain-specific communication applications easier. The UCM abstraction and API are proposed to achieve these goals. The dissertation also integrates formal methods into the UCM development process. A formal model of UCM is created using the SAM methodology; some design errors were found during model creation because the formal method forces a precise description of UCM. Using the SAM tool, the formal UCM model is translated into a Promela model. In the dissertation, some system properties are defined as temporal logic formulas. These formulas are manually translated into Promela, individually integrated with the Promela model of UCM, and verified using the SPIN tool. The formal analysis helps verify system properties (for example, the multi-party multimedia protocol) and uncover bugs in the system.
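For illustration only (this particular property is not taken from the dissertation), a typical temporal logic property of the kind that would be checked with SPIN might require that every issued session request is eventually answered:

% Hypothetical LTL property, not from the dissertation: every issued session
% request is eventually followed by an established session or an explicit rejection.
\square \, \big( \mathit{requestSession} \rightarrow \lozenge \, ( \mathit{sessionEstablished} \lor \mathit{sessionRejected} ) \big)

In SPIN, such an LTL formula is converted into a never claim that is composed with the Promela model, and the checker then searches for executions that violate the property.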

Relevance: 10.00%

Abstract:

The lack of analytical models that can accurately describe large-scale networked systems makes empirical experimentation indispensable for understanding complex behaviors. Research on network testbeds for testing network protocols and distributed services, including physical, emulated, and federated testbeds, has made steady progress. Although the success of these testbeds is undeniable, they fail to provide: 1) scalability, for handling large-scale networks with hundreds or thousands of hosts and routers organized in different scenarios, 2) flexibility, for testing new protocols or applications in diverse settings, and 3) inter-operability, for combining simulated and real network entities in experiments. This dissertation tackles these issues in three different dimensions. First, we present SVEET, a system that enables inter-operability between real and simulated hosts. In order to increase the scalability of networks under study, SVEET enables time-dilated synchronization between real hosts and the discrete-event simulator. Realistic TCP congestion control algorithms are implemented in the simulator to allow seamless interactions between real and simulated hosts. SVEET is validated via extensive experiments and its capabilities are assessed through case studies involving real applications. Second, we present PrimoGENI, a system that allows a distributed discrete-event simulator, running in real-time, to interact with real network entities in a federated environment. PrimoGENI greatly enhances the flexibility of network experiments, through which a great variety of network conditions can be reproduced to examine what-if questions. Furthermore, PrimoGENI performs resource management functions, on behalf of the user, for instantiating network experiments on shared infrastructures. Finally, to further increase the scalability of network testbeds to handle large-scale high-capacity networks, we present a novel symbiotic simulation approach. We present SymbioSim, a testbed for large-scale network experimentation where a high-performance simulation system closely cooperates with an emulation system in a mutually beneficial way. On the one hand, the simulation system benefits from incorporating the traffic metadata from real applications in the emulation system to reproduce the realistic traffic conditions. On the other hand, the emulation system benefits from receiving the continuous updates from the simulation system to calibrate the traffic between real applications. Specific techniques that support the symbiotic approach include: 1) a model downscaling scheme that can significantly reduce the complexity of the large-scale simulation model, resulting in an efficient emulation system for modulating the high-capacity network traffic between real applications; 2) a queuing network model for the downscaled emulation system to accurately represent the network effects of the simulated traffic; and 3) techniques for reducing the synchronization overhead between the simulation and emulation systems.
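The time-dilated synchronization mentioned for SVEET can be pictured with a small, hypothetical sketch: real hosts are slowed by a dilation factor so that their perceived time advances in step with a slower-than-real-time simulator. The factor and the clock mapping below are assumptions for illustration, not SVEET's actual mechanism.

# Hypothetical sketch of time dilation, not SVEET's implementation: wall-clock
# time on a real host is scaled by a time-dilation factor (TDF) so that the
# host's perceived time advances at the pace of the network simulator.
import time

class DilatedClock:
    def __init__(self, tdf: float):
        self.tdf = tdf                      # e.g. tdf=10: 10 real seconds -> 1 perceived second
        self._epoch = time.monotonic()

    def now(self) -> float:
        """Perceived (virtual) time in seconds since the clock was created."""
        return (time.monotonic() - self._epoch) / self.tdf

    def sleep(self, virtual_seconds: float) -> None:
        """Block long enough that perceived time advances by virtual_seconds."""
        time.sleep(virtual_seconds * self.tdf)

# A host with tdf=10 that 'sleeps' 0.1 virtual seconds actually blocks for 1 s,
# giving the simulator ten times more wall-clock time per unit of virtual time.
clock = DilatedClock(tdf=10.0)
clock.sleep(0.1)
print(f"virtual time elapsed: {clock.now():.2f} s")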

Relevance: 10.00%

Abstract:

This qualitative study used grounded theory methods and purposeful sampling to explore perceptions of caring and being cared-for. Twenty-four adolescent male participants, identified as at-risk for school failure, completed a two-phase interview process exploring these phenomena within three relationships: the relationship with the friend, with the most caring person they knew, and with the teacher they felt cared for them. Each participant was asked a predetermined set of open questions in an initial semi-structured interview. In addition, each participant was encouraged to explore his own reflections on caring. A second interview allowed for member checking and for the participant to continue sharing his meaning of caring and being cared-for. Line-by-line analysis with open, axial and selective coding was applied to interview transcripts, along with a constant comparative method. Results indicated that the core category integrating all other categories was attachment bonding. Participants' stories manifested characteristics of proximity seeking, secure base, safe haven and distress upon involuntary separation from an attachment figure. Strategies facilitating attachment bonding were influenced by the power positions of the relational players. Participants responded positively to the one-caring when they felt cared-for. Results further indicated that participants did not need to feel a sense of belonging in order to feel cared-for. Teacher behaviors indicating openness for authentic connections with students were specific to the teacher's friendliness and professional competence. Teachers who nurtured feelings of being cared-for were uncommon in the participants' educational experience. The number of adolescent males leaving high school prematurely is both a personal problem and a social problem. Despite a "mask" of indifference often exhibited by adolescent males at-risk for school failure, teachers might consider the social/emotional needs of these students when implementing the curriculum. In addition, policy makers might consider the social/emotional needs of this vulnerable population when developing programs meant to foster psychological well-being and connectedness for adolescent males at-risk for school failure.