
Wanna be an IT Hero in your office...... come on, pick up your question.......

Started by dhilipkumar, Sep 13, 2008, 11:53 AM


dhilipkumar

what is Android

1) a humanoid robot.

2) A Linux-based open source platform for mobile cellular handsets. Android was developed by Google and the Open Handset Alliance (OHA), a coalition of hardware, software and telecommunications companies oriented towards advancing mobile telephony standards. More than 30 companies are involved in the OHA, including Qualcomm, Broadcom, HTC, Intel, Samsung, Motorola, Sprint, Texas Instruments and Japanese wireless carriers KDDI and NTT DoCoMo.

Android began its life as a Palo Alto-based startup company, founded in 2003. That company was subsequently acquired by Google in 2005. Co-founder Andy Rubin is perhaps best known for creating the Sidekick, one of the first Internet-capable smartphones with a QWERTY keyboard.

The Android platform includes an operating system based upon Linux, a GUI, a Web browser and many other applications considered key to a modern cellular handset. Android allows synchronization to a user's address book, calendar and other personal information management (PIM) programs, though individual software makers will have to customize their offerings. Naturally, Google Calendar and Maps will be built-in. Android will allow users to browse the Internet more easily, integrate mapping services with local business listings and use many other software features traditionally associated with personal computers rather than cellphones.

Although the initial demonstrations of Android have featured a generic QWERTY smartphone and large VGA screen, the operating system was written to run on relatively inexpensive handsets with conventional numeric keypads. Android will run on both of the most widely deployed cellular standards, GSM/HSDPA and CDMA/EV-DO. Android will also support:



  • Bluetooth
  • EDGE
  • 3G communication protocols, like EV-DO and HSDPA
  • WiFi
  • SMS messaging
  • MMS
  • video/still digital cameras
  • touchscreens
  • GPS
  • compasses
  • accelerometers
  • accelerated 3D graphics
Android includes software built by many different entities. For instance, the default Android web browser will be based on WebKit, like Apple's Safari; WebKit was originally based on the Konqueror web browser for Linux. Android's music and video playback software was developed by PacketVideo. Applications are written in Java and run on Dalvik, a virtual machine that runs on top of a Linux kernel. Android will be released under the Apache v2 open source license.

Google's strategy with Android likely will involve a wireless system that subsidizes users' service and hardware in exchange for geotargeted, customized advertising. To that end, Google is planning to bid in the January 2008 auction for the license on the 700 MHz spectrum in the U.S., an ideal frequency for high-speed broadband Internet access. Creating a wide coalition of partners that are independent of wireless telecommunication carriers could free mobile handset users from proprietary, closed services that have been characterized by slow service or limited features, opening the door for rapid development and integration of social networking, mobile video or m-commerce features.

Wireless companies may be reluctant to carry Android devices on their networks due to security concerns about widely available open source protocols. More fundamentally, threats to profitable revenue streams in ringtones, email and messaging services, games, GPS features and other competing applications could mean that carriers will not adopt the devices without significant market pressure. An open source web browser that offered support for VoIP calls over open WiFi networks, for instance, would be a substantial competitive threat to the wireless carriers. Android also starts from a position similar to the one Linux initially faced, as Symbian remains the world's largest mobile device operating system, with a 72% global share of the smartphone OS market during Q2 2007. Early support from Sprint (along with an upcoming deployment of WiMAX to over 54 million users) and T-Mobile, however, will help to drive adoption of Android devices in the U.S. domestic market.

Google has applied for several patents for mobile use, including mobile contextual advertising and payment schemes. An image-based inquiry system that would allow users to scan items with an integrated camera and immediately receive identification of those items using an integrated search engine could radically change commerce and how individuals navigate commerce and the world at large. Google's Mobile Adsense program is a glimpse of this emerging technological ecosystem.

The Android software development kit (SDK) was released in November 2007. The Android SDK includes development and debugging tools, a device emulator, software libraries, documentation, sample projects, video tutorials and FAQs. Google's immediate release of this SDK stands in stark contrast to RIM's BlackBerry OS and to Apple's approach with the iPhone, which was initially released as a closed system with no institutional support for third-party applications. Apple does, however, plan to release an iPhone SDK in early 2008.

The embedded video below features Google's Sergey Brin and Steve Horowitz discussing the availability of the SDK and the open source prospects of the platform, and then demonstrating applications on the Android platform.

dhilipkumar

what is data encryption/decryption IC

A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data. Some such devices are intended for half-duplex operation (in which input and output do not occur simultaneously), and others are designed for full-duplex operation (where input and output can occur simultaneously).

Encryption is the conversion of data into a form, called a cipher, that cannot be understood by unauthorized people. Decryption is the process of converting encrypted data back into its original form, so it can be understood. Encryption and decryption should not be confused with encoding and decoding, in which data is converted from one form to another but is not deliberately altered so as to conceal its content.
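To make the encrypt/decrypt pairing concrete, here is a minimal Python sketch. It uses a toy XOR keystream purely for illustration (real encryption/decryption ICs implement hardened standards such as DES or AES), and all names and values here are invented for the example:

import itertools

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each data byte with a repeating key byte; applying the same
    # operation twice with the same key restores the original bytes.
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

key = b"secret"
plaintext = b"meet at dawn"
ciphertext = xor_cipher(plaintext, key)   # encrypt outgoing data
recovered = xor_cipher(ciphertext, key)   # decrypt incoming data
assert recovered == plaintext

The same function serves both directions, which mirrors why one IC can handle both encryption and decryption; a full-duplex part simply provides two such paths that can run at the same time.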

An integrated circuit, sometimes called a chip or microchip, is a semiconductor wafer on which thousands or millions of tiny resistors, capacitors, and transistors are fabricated. These devices can perform dozens of tasks in electronics and computing.

dhilipkumar

Data integration

Customer data integration (CDI) is the process of consolidating and managing customer information from all available sources, including contact details, customer valuation data, and information gathered through interactions such as direct marketing. Properly conducted, CDI ensures that all relevant departments in the company have constant access to the most current and complete view of customer information available. As such, CDI is an essential element of customer relationship management (CRM).

Although many companies have been gathering customer data for a good number of years, it hasn't always been managed very effectively. As a result, companies may maintain outdated, redundant, and inconsistent customer data. According to a Forrester Research report, although 92% of companies surveyed believe having an integrated view of customer data is either "critical" or "very important," only 2% have actually managed to achieve that goal.

dhilipkumar

Application Compatibility Toolkit

Application Compatibility Toolkit (ACT) is a set of freely downloadable program utilities and related documents from Microsoft for ensuring compatibility among application programs in Windows operating systems, especially in a large network environment. The Toolkit can also be used to diagnose and fix problems that may be related to compatibility. The Toolkit can be used for applications running in Windows 2000 with Service Pack 3 or later, Windows XP, and Windows Server 2003. Microsoft says that the tools can be used to resolve over 200 symptoms of problems.
The tools consist of:

# An Application Compatibility Analyzer that is used to systematically pinpoint compatibility issues
# An Application Verifier that is used during development to identify potential compatibility issues before an application is deployed
# A Compatibility Administrator that is used to select and apply specific compatibility fixes and then deploy the fixes to other computers
The Toolkit can be downloaded from Microsoft's Web site or ordered for delivery on a CD. The Analyzer and the Verifier can also be downloaded or ordered separately.

dhilipkumar

What is DBMS

Database management system

A database management system (DBMS), sometimes just called a database manager, is a program that lets one or more computer users create and access data in a database. The DBMS manages user requests (and requests from other programs) so that users and other programs are free from having to understand where the data is physically located on storage media and, in a multi-user system, who else may also be accessing the data. In handling user requests, the DBMS ensures the integrity of the data (that is, making sure it continues to be accessible and is consistently organized as intended) and security (making sure only those with access privileges can access the data). The most typical DBMS is a relational database management system (RDBMS). A standard user and program interface is the Structured Query Language (SQL). A newer kind of DBMS is the object-oriented database management system (ODBMS).
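As a small illustration of SQL as the standard user and program interface, the sketch below uses Python's built-in sqlite3 module (a minimal single-user RDBMS); the table and column names are invented for the example:

import sqlite3

# The DBMS decides how and where rows are physically stored;
# the program only issues SQL requests.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (holder TEXT, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('Doe, John', 250.00)")
conn.commit()

for holder, balance in conn.execute(
        "SELECT holder, balance FROM accounts WHERE balance > ?", (100,)):
    print(holder, balance)
conn.close()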
A DBMS can be thought of as a file manager that manages data in databases rather than files in file systems. In IBM's mainframe operating systems, the nonrelational data managers were (and are, because these legacy application systems are still used) known as access methods.

A DBMS is usually an inherent part of a database product. On PCs, Microsoft Access is a popular example of a single- or small-group user DBMS. Microsoft's SQL Server is an example of a DBMS that serves database requests from multiple (client) users. Other popular DBMSs (these are all RDBMSs, by the way) are IBM's DB2, Oracle's line of database management products, and Sybase's products.

IBM's Information Management System (IMS) was one of the first DBMSs. A DBMS may be used by or combined with transaction managers, such as IBM's Customer Information Control System (CICS).

dhilipkumar

WHAT IS Centrino

Centrino is a technology package from Intel that provides built-in wireless support for laptop computers while making it possible to run a laptop all day (up to seven hours) without a battery recharge. Through Centrino, Intel hopes to encourage corporations and users to replace their current laptops with a newer, more mobile version. Analysts suggest that a more mobile laptop may in time replace the desktop computer as well.
The Centrino package consists of:


  • The Pentium M processor
  • The 855 chipset family
  • The PRO/Wireless network connection

In addition to a 400 MHz system bus and a 1 MB L2 cache, the Pentium M processor has the ability to draw only the voltage that applications demand. The 855 chipset supports up to 2 GB of double data rate (DDR) memory and USB 2.0 for faster data transfer. The PRO/Wireless connection supports Wi-Fi (802.11b) and power functions designed to maximize battery life.

One industry commentator reports an experience of up to seven hours of battery-supported use on an IBM Thinkpad.

dhilipkumar

WHAT IS charge-coupled device

A charge-coupled device (CCD) is a light-sensitive integrated circuit that stores and displays the data for an image in such a way that each pixel (picture element) in the image is converted into an electrical charge, the intensity of which is related to a color in the color spectrum. For a system supporting 65,535 colors, there will be a separate value for each color that can be stored and recovered. CCDs are now commonly included in digital still and video cameras. They are also used in astronomical telescopes, scanners, and bar code readers. The devices have also found use in machine vision for robots, in optical character recognition (OCR), in the processing of satellite photographs, and in the enhancement of radar images, especially in meteorology.

A CCD in a digital camera improves resolution compared with older technologies. Some digital cameras produce images having more than one million pixels, yet sell for under $1,000. The term megapixel has been coined in reference to such cameras. Sometimes a camera with an image of 1,024 by 768 pixels is given the label "megapixel," even though it technically falls short of the mark. Another asset of the CCD is its high degree of sensitivity. A good CCD can produce an image in extremely dim light, and its resolution does not deteriorate when the illumination intensity is low, as is the case with conventional cameras.

The CCD was invented in 1969 at Bell Labs, now part of Lucent Technologies, by George Smith and Willard Boyle.

dhilipkumar

comma-separated values file

In computers, a CSV (comma-separated values) file contains the values in a table as a series of ASCII text lines organized so that each column value is separated by a comma from the next column's value and each row starts a new line. Here's an example:

Doe,John,944-7077
Johnson,Mary,370-3920
Smith,Abigail,299-3958
(etc.)

A CSV file is a way to collect the data from any table so that it can be conveyed as input to another table-oriented application such as a relational database application. Microsoft Excel, a leading spreadsheet application, can read CSV files. A CSV file is sometimes referred to as a flat file.
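For instance, Python's standard csv module can parse such a file directly; the file name and the three-column layout below are assumed from the example above:

import csv

# Print each row of the three-column example file.
with open("contacts.csv", newline="") as f:
    for last, first, phone in csv.reader(f):
        print(f"{first} {last}: {phone}")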

dhilipkumar

change data capture (CDC)

Change data capture (CDC) is the process of capturing changes made at the data source and applying them throughout the enterprise. CDC minimizes the resources required for ETL (extract, transform, load) processes because it deals only with data changes. The goal of CDC is to ensure data synchronicity.
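One minimal way to picture CDC is diffing two snapshots of a source table and forwarding only the changed rows. Real CDC tools usually read database transaction logs or triggers rather than snapshots, but the Python sketch below (with invented row data) shows the idea:

# Compare an old and a new snapshot of a table keyed by id and emit
# only the rows that were inserted, updated, or deleted.
def capture_changes(old: dict, new: dict) -> list:
    changes = []
    for key, row in new.items():
        if key not in old:
            changes.append(("insert", key, row))
        elif old[key] != row:
            changes.append(("update", key, row))
    changes.extend(("delete", key, None) for key in old if key not in new)
    return changes

old = {1: "Doe,John", 2: "Smith,Abigail"}
new = {1: "Doe,Jane", 3: "Johnson,Mary"}
print(capture_changes(old, new))
# [('update', 1, 'Doe,Jane'), ('insert', 3, 'Johnson,Mary'), ('delete', 2, None)]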

dhilipkumar

data dictionary

A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them. A first step in analyzing a system of objects with which users interact is to identify each object and its relationship to other objects. This process is called data modeling and results in a picture of object relationships. After each data object or item is given a descriptive name, its relationship is described (or it becomes part of some structure that implicitly describes relationship), the type of data (such as text or image or binary value) is described, possible predefined values are listed, and a brief textual description is provided. This collection can be organized for reference into a book called a data dictionary.

When developing programs that use the data model, a data dictionary can be consulted to understand where a data item fits in the structure, what values it may contain, and basically what the data item means in real-world terms. For example, a bank or group of banks could model the data objects involved in consumer banking. They could then provide a data dictionary for a bank's programmers. The data dictionary would describe each of the data items in its data model for consumer banking (for example, "Account holder" and "Available credit").
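One simple way to represent such an entry in code is sketched below; the fields mirror the description above (name, description, data type, allowed values), and the banking example values are assumptions for illustration:

from dataclasses import dataclass, field

@dataclass
class DictionaryEntry:
    name: str              # descriptive name of the data item
    description: str       # brief textual description
    data_type: str         # e.g. text, image, or binary value
    allowed_values: list = field(default_factory=list)  # predefined values, if any

entry = DictionaryEntry(
    name="Account holder",
    description="Full legal name of the person who owns the account",
    data_type="text",
)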

dhilipkumar

10g


10g is Oracle's grid computing product group including (among other things) a database management system (DBMS) and an application server. In addition to supporting grid computing features such as resource sharing and automatic load balancing, 10g products automate many database management tasks. The Real Application Cluster (RAC) component makes it possible to install a database over multiple servers.

10g follows Oracle's 9i platform. Oracle says that the g (instead of the expected i) in the name symbolizes the company's commitment to the grid model. However, according to some reports, many early adopters are deploying 10g solely for its automation features and have no immediate plans of implementing a grid environment.

dhilipkumar

AS1


AS1 (Applicability Statement 1) is a specification for Electronic Data Interchange (EDI) communications between businesses using e-mail protocols. The specification has been largely superseded by Applicability Statement 2 (AS2). Both specifications were created by EDI over the Internet (EDIINT), a working group of the Internet Engineering Task Force (IETF) for developing secure and reliable business communications standards.
The AS1 standard supports S/MIME (Secure Multi-Purpose Internet Mail Extensions) and uses Simple Mail Transfer Protocol (SMTP) to transmit data by e-mail. Security, authentication, message integrity, and privacy are assured by the use of encryption and digital signatures. Another important feature, nonrepudiation, makes it impossible for the intended recipient of a message to deny having received it.

An Internet connection capable of sending and receiving e-mail, an EDI transfer engine, and digital certificates are required for data exchange using AS1. Almost any type of data can be transmitted.
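As a rough sketch of the transport side only, the Python snippet below mails an EDI payload as an e-mail attachment over SMTP. A real AS1 exchange would additionally wrap the payload in S/MIME signing and encryption and return a signed receipt; the host names, addresses, and payload here are all placeholders:

import smtplib
from email.message import EmailMessage

# Build a message carrying an EDI document as an attachment.
msg = EmailMessage()
msg["From"] = "edi@sender.example"
msg["To"] = "edi@receiver.example"
msg["Subject"] = "EDI interchange"
msg.set_content("AS1-style EDI transmission (sketch; no S/MIME layer).")
msg.add_attachment(b"ISA*00*...~",  # placeholder EDI content
                   maintype="application", subtype="edi-x12",
                   filename="interchange.edi")

with smtplib.SMTP("mail.sender.example") as smtp:  # placeholder mail host
    smtp.send_message(msg)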

dhilipkumar

ActiveX Data Objects

ActiveX Data Objects (ADO) is an application program interface from Microsoft that lets a programmer writing Windows applications get access to a relational or non-relational database from both Microsoft and other database providers. For example, if you wanted to write a program that would provide users of your Web site with data from an IBM DB2 database or an Oracle database, you could include ADO program statements in an HTML file that you then identified as an Active Server Page. Then, when a user requested the page from the Web site, the page sent back would include appropriate data from a database, obtained using ADO code.

Like Microsoft's other system interfaces, ADO is an object-oriented programming interface. It is also part of an overall data access strategy from Microsoft called Universal Data Access. Microsoft says that rather than trying to build a universal database as IBM and Oracle have suggested, finding a way to provide universal access to various kinds of existing and future databases is a more practical solution. In order for this to work, Microsoft and other database companies provide a "bridge" program between the database and Microsoft's OLE DB, the low-level interface to databases.

OLE DB is the underlying system service that a programmer using ADO is actually using. A feature of ADO, Remote Data Service, supports "data-aware" ActiveX controls in Web pages and efficient client-side caches. As part of ActiveX, ADO is also part of Microsoft's overall Component Object Model (COM), its component-oriented framework for putting programs together.

ADO evolved from an earlier Microsoft data interface, Remote Data Objects (RDO). RDO works with Microsoft's ODBC to access relational databases, but not nonrelational databases such as IBM's ISAM and VSAM.

dhilipkumar

Adaptive Server Enterprise

Adaptive Server Enterprise (ASE) is a relational database management system (RDBMS) from Sybase, Inc. that runs on Linux and other Unix-based operating systems, Windows NT and Windows 2000, and Mac OS. ASE evolved from a program originally called Sybase SQL Server, which was first released in the 1980s. Although ASE is a proprietary program, free versions are available.

ASE is designed primarily for use on high-end servers and, according to Sybase, is especially good at handling online transaction processing (OLTP) workloads. ASE Version 15, released in September 2005, includes cursor scrolling, messaging services, automatic updating, specialized job wizards, very large server support (VLSS), native storage and processing of XML documents, enhanced encryption, and a query processing engine.

Prior to 1994, Sybase SQL Server evolved along the same lines as Microsoft SQL Server. Then Microsoft bought a copy of the Sybase SQL Server source code and began engineering its product along a different line. A couple of years later, Sybase renamed its product ASE (to distinguish it from the Microsoft product) and released ASE Version 11.5.

dhilipkumar


WHAT IS Andrew file system..?


An Andrew file system (AFS) is a location-independent file system that uses a local cache to reduce the workload and increase the performance of a distributed computing environment. A first request for data to a server from a workstation is satisfied by the server and placed in a local cache. A second request for the same data is satisfied from the local cache.
:D
The Andrew file system was developed at Carnegie Mellon University.

dhilipkumar

WHAT IS Data Access Objects


DAO (Data Access Objects) is an application program interface (API) available with Microsoft's Visual Basic that lets a programmer request access to a Microsoft Access database. DAO was Microsoft's first object-oriented interface with databases. DAO objects encapsulate Access's Jet database engine functions, through which DAO can also access other Structured Query Language (SQL) databases.


To conform with Microsoft's vision of a Universal Data Access (UDA) model, programmers are being encouraged to move from DAO, although it is still widely used, to ActiveX Data Objects (ADO) and its low-level interface with databases, OLE DB. ADO and OLE DB offer a faster interface that is also easier to program.

dhilipkumar

grid computing

Grid computing (or the use of a computational grid) is applying the resources of many computers in a network to a single problem at the same time - usually to a scientific or technical problem that requires a great number of computer processing cycles or access to large amounts of data. A well-known example of grid computing in the public domain is the ongoing SETI (Search for Extraterrestrial Intelligence) @Home project in which thousands of people are sharing the unused processor cycles of their PCs in the vast search for signs of "rational" signals from outer space. According to John Patrick, IBM's vice-president for Internet strategies, "the next big thing will be grid computing."

Grid computing requires the use of software that can divide and farm out pieces of a program to as many as several thousand computers. Grid computing can be thought of as distributed and large-scale cluster computing and as a form of network-distributed parallel processing. It can be confined to the network of computer workstations within a corporation or it can be a public collaboration (in which case it is also sometimes known as a form of peer-to-peer computing).
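On a single machine, Python's multiprocessing module gives a toy picture of that divide-and-farm-out step; a real grid would ship these work units to thousands of networked computers rather than to local processes:

from multiprocessing import Pool

def work_unit(chunk):
    # Stand-in for an expensive computation on one piece of the problem.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
    with Pool() as pool:                   # farm the pieces out
        partials = pool.map(work_unit, chunks)
    print(sum(partials))                   # recombine the results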

A number of corporations, professional groups, university consortiums, and other groups have developed or are developing frameworks and software for managing grid computing projects. The European Union (EU) is sponsoring a project for a grid for high-energy physics, earth observation, and biology applications. In the United States, the National Technology Grid is prototyping a computational grid for infrastructure and an access grid for people. Sun Microsystems offers Grid Engine software. Described as a distributed resource management (DRM) tool, Grid Engine allows engineers at companies like Sony and Synopsys to pool the computer cycles on up to 80 workstations at a time. (At this scale, grid computing can be seen as a more extreme case of load balancing.)

Grid computing appears to be a promising trend for three reasons: (1) its ability to make more cost-effective use of a given amount of computer resources, (2) as a way to solve problems that can't be approached without an enormous amount of computing power, and (3) because it suggests that the resources of many computers can be cooperatively and perhaps synergistically harnessed and managed as a collaboration toward a common objective. In some grid computing systems, the computers may collaborate rather than being directed by one managing computer. One likely area for the use of grid computing will be pervasive computing applications - those in which computers pervade our environment without our necessary awareness.

dhilipkumar

What is Hot backup

A hot backup, also called a dynamic backup, is a backup performed on data that remains accessible to users and may be in the process of being updated while the backup runs. Hot backups can provide a convenient solution in multi-user systems because they do not require downtime, as a conventional cold backup does.

Hot backups involve certain risks. If the data is altered while the backup is in progress, the resulting copy may not match the final state of the data. If recovery of the data becomes necessary (for example, following a system crash), the inconsistency must be resolved. The Oracle database preserves the integrity of the data by creating a so-called redo log prior to executing a hot backup and by placing the system in a special hot-backup mode while the data is copied. Performance may be degraded as the backup is taking place. Individual users may notice this as a temporary system or network slowdown.

dhilipkumar

Bayesian logic

Named for Thomas Bayes, an English clergyman and mathematician, Bayesian logic is a branch of logic applied to decision making and inferential statistics that deals with probability inference: using the knowledge of prior events to predict future events. Bayes first proposed his theorem in An Essay Towards Solving a Problem in the Doctrine of Chances, published in 1763, two years after his death in 1761. Bayes' theorem provided, for the first time, a mathematical method that could be used to calculate, given occurrences in prior trials, the likelihood of a target occurrence in future trials. According to Bayesian logic, the only way to quantify a situation with an uncertain outcome is through determining its probability.
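In its standard modern statement (not written out in the passage below), the theorem reads:

P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}

that is, the probability of hypothesis A given evidence B equals the probability of the evidence under the hypothesis, weighted by the prior probability of the hypothesis and normalized by the overall probability of the evidence.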

Bayes' Theorem is a means of quantifying uncertainty. Based on probability theory, the theorem defines a rule for refining an hypothesis by factoring in additional evidence and background information, and leads to a number representing the degree of probability that the hypothesis is true. To demonstrate an application of Bayes' Theorem, suppose that we have a covered basket that contains three balls, each of which may be green or red. In a blind test, we reach in and pull out a red ball. We return the ball to the basket and try again, again pulling out a red ball. Once more, we return the ball to the basket and pull a ball out - red again. We form a hypothesis that all the balls are, in fact, red. Bayes' Theorem can be used to calculate the probability (p) that all the balls are red (an event labeled as "A") given (symbolized as "|") that all the selections have been red (an event labeled as "B"):

p(A|B) = p(A AND B) / p(B)

Of all the possible combinations (RRR, RRG, RGG, GGG), the chance that all the balls are red is 1/4; in 1/8 of all possible outcomes, all the balls are red AND all the selections are red. Bayes' Theorem calculates the probability that all the balls in the basket are red, given that all the selections have been red, as 0.5 (probabilities are expressed as numbers between 0 and 1, with 1 indicating 100% probability and 0 indicating zero probability).
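Written as a worked equation, the figures in the passage (p(A AND B) = 1/8, together with the implied p(B) = 1/4) give:

p(A \mid B) = \frac{p(A \cap B)}{p(B)} = \frac{1/8}{1/4} = 0.5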

The International Society for Bayesian Analysis (ISBA) was founded in 1992 with the purpose of promoting the application of Bayesian methods to problems in diverse industries and government, as well as throughout the sciences. The modern incarnation of Bayesian logic has evolved beyond Bayes' initial theorem, developed further by the 18th century French theorist Pierre-Simon de Laplace, and 20th and 21st century practitioners such as Edwin Jaynes, Larry Bretthorst, and Tom Loredo. Current and possible applications of Bayesian logic include an almost infinite range of research areas, including genetics, astrophysics, psychology, sociology, artificial intelligence (AI), data mining, and computer programming.

dhilipkumar

backward mapping


Backward mapping (also known as inverse mapping or screen order) is a technique used in texture mapping to create a 2D image from 3D data. Because a texel (the smallest graphical element in a texture map) does not correspond exactly to a screen pixel, the developer must apply a filter computation to map the texels to the screen. Forward mapping steps through the texture map and computes the screen location of each texel. In comparison, backward mapping steps through the screen pixels and computes a pixel's color according to the texels that map to it.

Real time video effects systems use forward mapping. However, since many texels are likely to map to a single pixel, performing the filter computation at each pixel for every frame is very expensive. Most systems that don't have to produce real time effects use backward mapping.
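A bare-bones version of the backward-mapping loop looks like this in Python; the nearest-neighbour lookup stands in for the filter computation, and the screen-to-texture mapping function is an invented placeholder:

# Step through screen pixels and fetch the texel that maps to each one
# (backward mapping), rather than stepping through texels (forward mapping).
def render(screen_w, screen_h, texture, tex_w, tex_h, screen_to_texture):
    image = [[None] * screen_w for _ in range(screen_h)]
    for y in range(screen_h):
        for x in range(screen_w):
            u, v = screen_to_texture(x, y)       # inverse of the 3D mapping
            tx = min(int(u * tex_w), tex_w - 1)  # nearest-neighbour "filter"
            ty = min(int(v * tex_h), tex_h - 1)
            image[y][x] = texture[ty][tx]
    return image

# Trivial identity mapping over a 2x2 checkerboard texture.
tex = [[0, 1], [1, 0]]
img = render(4, 4, tex, 2, 2, lambda x, y: (x / 4, y / 4))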



dhilipkumar

WHAT IS CableCARD

CableCARD is a plug-in card approximately the size of a credit card that allows consumers in the United States to view and record digital cable television channels on digital video recorders, personal computers and televisions without the use of other equipment, such as a set-top box (STB) provided by a cable television company. The card, provided by the local cable company for a nominal monthly fee, is a PCMCIA card and looks exactly like those used with laptops.

In technical contexts, "CableCARD" refers more broadly to a set of technologies created by the United States cable television industry in response to requirements of the federal government's Telecommunications Act of 1996 that cable companies allow devices they did not provide to access their networks. Use of the term CableCARD can be confusing, because some of the technologies refer not to the physical card but to a device ("Host") that uses the card. Some CableCARD technologies can be used with devices that have no physical CableCARD.

                                 

dhilipkumar


CableCARD (still) goes unloved, except in set-top boxes

The FCC may have had the best of intentions when it decreed that cable operators would have to open their set-top boxes to competition, but intentions can't make people buy and use CableCARD products. The cable industry's quarterly report on CableCARD deployments (PDF) makes that point, as it does every quarter, simply by presenting the dismal numbers: only 18,000 people in the US requested a CableCARD over the last three months.

This brings the total number of deployments to only 392,000 when the top ten cable operators are lumped together. 596 (one-way) TVs and other host devices support the cards, which allow for the decrypting of cable channels without a cable-provided set-top box, but buyers don't seem to care.

It's not hard to see why. Cablevision's numbers, for instance, show that its CableCARD deployments increased by less than one percent over the quarter. This means that the company did fewer than 167 installs in the last three months, yet it recorded 2,138 CableCARD problems in its service database. Even after the cards are initially installed and functioning, the CableCARD ecosystem is a delicate one. Simply looking at your TV funny can cause issues as varied as blank channels, incomplete channel lineups, and inscrutable onscreen error codes.

It's not even the cards themselves that are at fault most of the time. Cablevision's service database showed that 49 percent of all such problems were related to the host devices, 39 percent to the CableCARDs, and another 12 percent to the Cablevision network. Couple these problems with the $46.95 install fee, the need to wait for a famously punctual cable tech to show up and install the card into the TV, and the $2.00 monthly rental fee, and CableCARD hasn't proved compelling to most consumers (though it hasn't shown up in many mainstream TV sets, either).

Where CableCARD has taken off is in those cable company set-top boxes that CableCARD was meant to liberate us from. The FCC even went so far as to lay down an "integration ban" back in 2007 that required cable operators to start using CableCARDs in their own boxes. The goal was to give cable a real incentive to make sure CableCARDs worked well and that CableCARD devices could compete on a level playing field. The industry screamed about the cost this would impose, but eventually had to give in, and now it takes every opportunity to tell the FCC how stupid cable thinks the whole arrangement is.


When cable files its quarterly CableCARD reports, therefore, we often have the chance to read paragraphs like this: "By contrast, since the 'integration ban' went into effect on July 1, 2007, those 10 companies have already deployed more than 9,766,000 operator-supplied set-top boxes with CableCARDs. Therefore, in less than 18 months, cable operators have deployed more than 24 times as many CableCARD-enabled devices than the total number of CableCARDs requested by customers for use in UDCPs in just over the last four years."

Translation: "You forced the entire industry to adopt this tech, you made us rework our cable boxes, and now all we have to show for it are more expensive set-top boxes; consumers aren't using anything else."

In addition to appearing only in high-end products, CableCARDs have generally not been able to access two-way features such as interactive programming guides and pay-per-view movies, making them less attractive. It also doesn't help that cable operators have sometimes adopted switched digital video (SDV) systems that save bandwidth while breaking many third-party CableCARD devices.

But the basic technology isn't going away. Even as consumer electronics manufacturers have joined cable on the "tru2way" bandwagon, CableCARDs will still be needed to decrypt protected content. Two-way features like the program guide might work as soon as you plug that shiny new tru2way TV into your cable connection, but you won't be watching HBO without experiencing the joys of the CableCARD.


dhilipkumar

WHAT IS MDU (multi-dwelling unit):


multi-dwelling unit broadcasting signal distribution system

The present invention relates to an MDU broadcasting signal distribution system, and more specifically, to an MDU system that receives a broadcasting signal through a single broadcast-receiving antenna in a multiplex house or building where an IP network is provided, and then transmits the broadcasting signal to each MDU set-top box through the IP network.

According to the invention, when an IP network is provided in a multiplex house or building, the set-top box is connected to the IP network so as to receive satellite broadcasting, terrestrial DMB, and cable broadcasting without installing a dish-shaped antenna or cable. Broadcasting signals can therefore be distributed effectively to each home, so that the desired broadcasting can be enjoyed in the home. Further, by applying IGMP, broadcasting can be selectively provided to an MDU set-top box belonging to a specific group.

dhilipkumar

1. Multi-dwelling unit (MDU) broadcasting signal distribution system which distributes a broadcasting signal to one or more MDU set-top boxes, which comprises: a broadcasting signal receiving unit for receiving a broadcasting signal;

An MDU module for demultiplexing MPTS into SPTS, which MPTS is included in the broadcasting signal, for packetizing the demultiplexed broadcasting signal by applying UDP, for allocating IP address to each channel of the broadcasting signal so as to create IP mapping table, and then for packetizing the IP mapping table so as to transmit the packetized IP mapping table and the packetized broadcasting signal in multicast;

One or more L3 switches, directly connected to the MDU module, for receiving packet signal from the MDU module, and then for distributing and transmitting the received packet signal;

One or more L2 switches, directly connected to MDU set-top boxes through Ethernet network, for receiving the packet signal from the L3 switch, and then for providing the received packet signal to the MDU set-top boxes;

MDU set-top boxes, including: an Ethernet port for receiving packet signal from the L2 switch; an IP mapping table recognition unit for extracting IP mapping table from the packet signal, and when a broadcasting channel is selected by an external operation, for receiving signal corresponding to the IP address of the broadcasting channel by referring to the IP mapping table; a central processing unit for extracting broadcasting signal from the packet signal and then for converting the broadcasting signal into a signal that can be MPEG-processed;

A tuner for directly receiving a broadcasting signal from outside; a demultiplexer for demultiplexing the signals provided from the tuner and the central processing unit; a decoder for decoding the broadcasting signal demultiplexed by the demultiplexer;

And an A/V output unit for outputting the decoded A/V broadcasting signal to the outside.

2. The MDU broadcasting signal distribution system according to Claim 1, wherein the MDU module includes: a plurality of distribution modules for receiving the distributed broadcasting signal; a TS-to-IP processing module, including: a demultiplexer for extracting MPTS from the broadcasting signal input through the distribution module and then for demultiplexing the extracted MPTS into SPTS; and a protocol processing unit for applying UDP to the demultiplexed broadcasting signal so as to create a packet signal; a control module for controlling so as to let the packet signal, generated from the TS-to-IP processing module, be output to the outside in the multicast scheme; and a switch module, connected to the L3 switch through an IP network, for transmitting the packet signal to the L3 switch.

3. The MDU broadcasting signal distribution system according to Claim 2, wherein, when an IGMP customer registration signal provided from an MDU set-top box is received through the L2 switch, the L3 switch sets the MDU set-top box to a destination and transmits a corresponding packet signal to the MDU set-top box through the L2 switch.
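The "customer registration" step in Claim 3 corresponds to an ordinary IGMP multicast join, which a receiver's network stack issues when an application subscribes to a group address. A minimal Python sketch of such a subscription follows; the group address and port are placeholders, whereas the actual system in the claims selects them from its IP mapping table:

import socket
import struct

GROUP, PORT = "239.1.1.1", 5000  # placeholder channel address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Joining the group makes the host send an IGMP membership report, so
# upstream switches begin forwarding that channel's packets to this host.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, addr = sock.recvfrom(2048)  # first packet of the selected channel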

dhilipkumar

CORBA

Common Object Request Broker Architecture (CORBA) is an architecture and specification for creating, distributing, and managing distributed program objects in a network. It allows programs at different locations and developed by different vendors to communicate in a network through an "interface broker." CORBA was developed by a consortium of vendors through the Object Management Group (OMG), which currently includes over 500 member companies. Both International Organization for Standardization (ISO) and X/Open have sanctioned CORBA as the standard architecture for distributed objects (which are also known as components). CORBA 3 is the latest level.

The essential concept in CORBA is the Object Request Broker (ORB). ORB support in a network of clients and servers on different computers means that a client program (which may itself be an object) can request services from a server program or object without having to understand where the server is in a distributed network or what the interface to the server program looks like. To make requests or return replies between the ORBs, programs use the General Inter-ORB Protocol (GIOP) and, for the Internet, its Internet Inter-ORB Protocol (IIOP). IIOP maps GIOP requests and replies to the Internet's Transmission Control Protocol (TCP) layer in each computer.

A notable hold-out from CORBA is Microsoft, which has its own distributed object architecture, the Distributed Component Object Model (DCOM). However, CORBA and Microsoft have agreed on a gateway approach so that a client object developed with the Component Object Model will be able to communicate with a CORBA server (and vice versa).

Distributed Computing Environment (DCE), a distributed programming architecture that preceded the trend toward object-oriented programming and CORBA, is currently used by a number of large companies. DCE will perhaps continue to exist along with CORBA and there will be "bridges" between the two.


dhilipkumar

OS X widget

An OS X widget is a downloadable, interactive virtual tool that provides services or increased functionality within the Apple operating system. Essentially, widgets are miniature applications that allow the user to perform common tasks easily and access information quickly. OS X 10.3 and 10.4 automatically revert to the Dashboard after a set period of time, bringing whatever widgets are installed and active to the foreground.

In their current incarnation, widgets provide services including -- but by no means limited to -- the following:


  • Lists of the latest news headlines or RSS news feeds.
  • Customizable weather forecasting, with radar and reflectivity real-time tracking.
  • Mapping, including access to mash-ups of geography and other data and Microsoft's Virtual Earth.
  • iTunes media content playback and listings.
  • Sticky notes.
  • Stock market activity tracking.
  • Travel information, including links to flight tracking and airport status reports.
  • Language translation.
  • Real-time sports scores.
  • Customizable interfaces for listening to Internet radio stations or podcasts.
  • Direct search for Wikipedia, Dictionary.com and other references.
  • Wireless network scanning.
  • IPTV browsing and viewing.
  • eBay auction monitoring.
  • MySpace searches.
Early examples of widgets existed as desktop accessories on earlier versions of the Macintosh OS. These widgets were written as device drivers, allowing a Mac user some multitasking ability.

Microsoft's new OS, Vista, is expected to incorporate widgets in some form, under the heading of "gadgets."

dhilipkumar

Cell phone, smartphone -- what's the difference?

What is a smartphone? The answer is not so simple, judging by the number of definitions available. In fact, it can be a bit of a mystery. All the popular definitions rely on the fundamental understanding that a smartphone brings together a computer with a wireless voice device. Everyone agrees on that.

But there are many nuances that separate a smartphone from a standard wireless phone, which also can incorporate some kind of a computer with wireless voice capability. Mobile industry analysts use these subtle distinctions to determine how to count smartphones separately from other wireless phones. For example, they are able to say that a wireless phone, such as the LG Rumor2, which goes on sale by Sprint Nextel Corp. on Sunday, is technically not a smartphone, although it provides access to e-mail, Internet browsing and a Qwerty keyboard.

The iPhone, just about any BlackBerry, and Nokia N or E series devices are considered smartphones, at least according to Gartner Inc. and IDC, the biggest market research firms monitoring wireless phone and smartphone shipments.

The CTIA, an industry association representing hundreds of wireless device makers and wireless carriers, uses a simple approach (possibly the simplest) in its glossary. It defines smartphones as "wireless phones with advanced data features and often keyboards." It adds, "What makes the phone 'smart' is its ability to manage and transmit data in addition to voice calls." However, a CTIA spokeswoman said there is apparently no industrywide standard definition for a smartphone and that the CTIA's glossary definition is "general."

Four industry analysts interviewed for this story said the word smartphone is indeed a term of art, subject to the many changes that have been made in wireless handhelds since 2000, when Palm Inc. started adding voice capabilities to its personal digital assistants.

"Smartphone is one of those terms of art that gets bantered about so often," said Ramon Llamas, an IDC analyst.

IDC conducted a survey of consumers last summer and discovered many different interpretations. For some people, a smartphone has to be able to access the Internet wirelessly, while others think it has to handle text messaging or allow typing on a touch screen or actual keyboard, Llamas said.

"When you talk to the folks on Mainstreet U.S.A., it's a real can of worms," he said. "There is still a lot of confusion as to what counts as a smartphone."

IDC first coined the term converged mobile device in 2002 to avoid using the term smartphone, which Microsoft Corp. was using to describe enterprise-focused wireless handhelds, Llamas said. The definition IDC developed has gone through several updates since then, with a key change in 2006 that added the requirement that a converged mobile device include a "high-level operating system."

dhilipkumar

Cell phone, smartphone -- what's the difference?


Today's definition from IDC for a converged mobile device, which is IDC's equivalent to smartphone in IDC press releases on phone shipments, reads, "A subset of mobile phones, converged mobile devices feature a high-level operating system that enables the device to run third-party applications in addition to voice telephony. Examples of high-level operating systems include Android, BlackBerry, Linux, Mac OS X, Palm, Symbian, and Windows Mobile. Converged mobile devices share many features with traditional mobile phones, including personal information management, multimedia, games, and office applications, but the presence of a high-level operating system differentiates these devices from all others."

Llamas said the definition of "high-level OS" has three parts. "High level is the linchpin of the definition," Llamas said.

A high-level OS, as IDC defines it, means that the OS has to be able to run third-party applications, not just those written by the OS maker; the applications must be able to run on the phone independent of the wireless network; and the OS must be able to run multiple applications concurrently.

By comparison, Gartner Inc. uses a written definition for both entry-level and feature smartphones, with a similar mention of a more powerful OS as an important distinction. Gartner says an entry-level smartphone must run on an open operating system, while the feature smartphone adds support for one or more functions, such as music, video, gaming, pictures, Internet browsing, mobile TV, navigation and messaging. They usually have "larger displays, more powerful processors, more embedded memory and better battery capacity."

Gartner also says the feature smartphones can have a touch screen or a full Qwerty keyboard, but neither one of those is a requirement.

Both IDC and Gartner analysts agreed that the LG Rumor2 is not a smartphone.

Ken Dulaney, a Gartner analyst, said the Rumor2 is "probably not" a smartphone because it doesn't have a "market recognized" operating system or published APIs.

And Llamas said that while the LG Rumor's operating system is "a well-developed proprietary OS," it still isn't a "high-level" OS in IDC's parlance.

Ryan Reith, also an IDC analyst, said the Rumor2 isn't a smartphone because it doesn't support third-party applications. "There's no real opportunity to get to the core of that Rumor OS and allow consumers to use third-party applications of their choice," Reith said.

Reith noted that another defining characteristic of smartphones is that they are beginning to include an applications processor, a piece of hardware that allows the smartphone to run multiple applications at one time.

Even the device maker, LG Electronics, and the carrier, Sprint Nextel, aren't calling the Rumor2 a smartphone, but their reasons don't follow the same lines as the analysts.

A Sprint spokeswoman said the Rumor2 might seem to qualify as a smartphone but that Sprint has avoided using the term "just because there's not a good definition of smartphone" that is widely agreed upon.

An LG spokeswoman came up with a fairly specific reason why the Rumor2 is not a smartphone. In an e-mail, she wrote, "This particular device [the Rumor2] is not considered a smartphone. There is not a true definition of a smartphone, but it is generally accepted that a 'smartphone' is one that can sync more than one email account (Webmail, Gmail, etc.) onto your phone. This phone, while it does have Internet access, does not sync email onto the desktop."

Reith said LG's reasoning supports IDC's finding that the Rumor2 doesn't have a high-level OS in the sense that its OS does not allow applications to run entirely on the phone separate from the network. Sprint notes in its specifications sheet for the Rumor2 that access to Microsoft Exchange and Lotus Notes comes through Sprint's Mobile Email Work.

With the addition of software, Sprint could have changed that capability but chose not to, Reith noted.

In summary, just about everyone agrees that there is no precise, standard definition of the smartphone. Llamas said IDC's take has been criticized and praised alike from many parties.

Even though there are disparities in some definitions, analysts tend to report roughly the same numbers for shipments of smartphones, Reith said. Part of the reason is that analysts pay attention to one another's numbers and to what the vendors call a smartphone, Reith and Llamas said.

Reith said he couldn't think of a single device categorized by IDC as a smartphone to which Gartner or other major analyst firms wouldn't agree.

Still, the analysts acknowledged that the question of what a smartphone is can be confusing and even mysterious for the public. One analyst said that the CTIA's definition "probably needs to be updated," but Llamas said picking a proper definition can be a delicate matter.

"I'll respect others' definitions, and I'll stick with mine," Llamas said, laughing. "I'm being diplomatic."

dhilipkumar

What is Data center

Data centers have their roots in the huge computer rooms of the early ages of the computing industry. Early computer systems were complex to operate and maintain, and required a special environment in which to operate. Many cables were necessary to connect all the components, and methods to accommodate and organize these were devised, such as standard racks to mount equipment, elevated floors, and cable trays (installed overhead or under the elevated floor). Also, old computers required a great deal of power, and had to be cooled to avoid overheating. Security was important – computers were expensive, and were often used for military purposes. Basic design guidelines for controlling access to the computer room were therefore devised.

During the boom of the microcomputer industry, and especially during the 1980s, computers started to be deployed everywhere, in many cases with little or no care about operating requirements. However, as information technology (IT) operations started to grow in complexity, companies grew aware of the need to control IT resources. With the advent of client-server computing, during the 1990s, microcomputers (now called "servers") started to find their places in the old computer rooms. The availability of inexpensive networking equipment, coupled with new standards for network cabling, made it possible to use a hierarchical design that put the servers in a specific room inside the company. The use of the term "data center," as applied to specially designed computer rooms, started to gain popular recognition about this time.

The boom of data centers came during the dot-com bubble. Companies needed fast Internet connectivity and nonstop operation to deploy systems and establish a presence on the Internet. Installing such equipment was not viable for many smaller companies. Many companies started building very large facilities, called Internet data centers (IDCs), which provide businesses with a range of solutions for systems deployment and operation. New technologies and practices were designed to handle the scale and the operational requirements of such large-scale operations. These practices eventually migrated toward the private data centers, and were adopted largely because of their practical results.

As of 2007, data center design, construction, and operation is a well-known discipline. Standard documents from accredited professional groups, such as the Telecommunications Industry Association, specify the requirements for data center design. Well-known operational metrics for data center availability can be used to evaluate the business impact of a disruption. A great deal of development is still being done in operational practice, and also in environmentally friendly data center design.

Data center classification

The TIA-942: Data Center Standards Overview describes the requirements for the data center infrastructure. The simplest is a Tier 1 data center, which is basically a computer room following basic guidelines for the installation of computer systems. The most stringent level is a Tier 4 data center, which is designed to host mission-critical computer systems, with fully redundant subsystems and compartmentalized security zones controlled by biometric access control methods. Another consideration is the placement of the data center in a subterranean setting, for data security as well as for environmental considerations such as cooling requirements.[1]


dhilipkumar

Data center : Physical layout

A data center can occupy one room of a building, one or more floors, or an entire building. Most of the equipment is often in the form of servers mounted in 19 inch rack cabinets, which are usually placed in single rows forming corridors between them. This allows people access to the front and rear of each cabinet. Servers differ greatly in size from 1U servers to large freestanding storage silos which occupy many tiles on the floor. Some equipment such as mainframe computers and storage devices are often as big as the racks themselves, and are placed alongside them. Very large data centers may use shipping containers packed with 1,000 or more servers each; when repairs or upgrades are needed, whole containers are replaced (rather than repairing individual servers).

Local building codes may govern the minimum ceiling heights.




The physical environment of a data center is rigorously controlled:

# Air conditioning is used to control the temperature and humidity in the data center. ASHRAE's "Thermal Guidelines for Data Processing Environments" recommends a temperature range of 20–25 °C (68–75 °F) and a humidity range of 40–55%, with a maximum dew point of 17 °C, as optimal for data center conditions. The electrical power used heats the air in the data center; unless the heat is removed, the ambient temperature will rise, resulting in electronic equipment malfunction. By controlling the air temperature, the server components at the board level are kept within the manufacturer's specified temperature/humidity range. Air conditioning systems help control humidity by cooling the return-space air below the dew point. If humidity is too high, water may begin to condense on internal components; if the atmosphere is too dry, ancillary humidification systems may add water vapor, because too little humidity can result in static electricity discharge problems that damage components. Subterranean data centers may keep computer equipment cool while expending less energy than conventional designs.

# Backup power consists of one or more uninterruptible power supplies and/or diesel generators.

# To prevent single points of failure, all elements of the electrical systems, including backup system, are typically fully duplicated, and critical servers are connected to both the "A-side" and "B-side" power feeds. This arrangement is often made to achieve N+1 Redundancy in the systems. Static switches are sometimes used to ensure instantaneous switchover from one supply to the other in the event of a power failure.

# Data centers typically have raised flooring made up of 60 cm (2 ft) removable square tiles. The trend is towards an 80–100 cm (31.5–39.4 in) void to allow better and more uniform air distribution. The void provides a plenum for air to circulate below the floor, as part of the air conditioning system, as well as space for power cabling. Data cabling is typically routed through overhead cable trays in modern data centers, but some still recommend cabling under the raised floor for security reasons, and to allow for the addition of cooling systems above the racks should that enhancement become necessary. Smaller or less expensive data centers without raised flooring may use anti-static tiles for a flooring surface.

# Data centers feature fire protection systems, including passive and active design elements, as well as fire prevention programs in operations. Smoke detectors are usually installed to provide early warning of a developing fire by detecting particles generated by smoldering components before flames develop. This allows investigation, interruption of power, and manual fire suppression using handheld fire extinguishers before the fire grows to a large size. A fire sprinkler system is often provided to control a full-scale fire if it develops; sprinklers require 18 inches of clearance (free of cable trays, etc.) below them and are typically spaced 14 feet apart. Clean-agent gaseous fire suppression systems are sometimes installed to suppress a fire earlier than the sprinkler system would. Passive fire protection elements include fire walls around the data center, so that a fire can be restricted to a portion of the facility for a limited time if the active fire protection systems fail or are not installed.

# Physical security also plays a large role in data centers. Physical access to the site is usually restricted to selected personnel, with controls including bollards and mantraps. Video camera surveillance and permanent security guards are almost always present if the data center is large or contains sensitive information on any of the systems within.
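Returning to the air-conditioning item above, here is a minimal sketch of checking a sensor reading against that recommended envelope. It assumes the Magnus approximation for dew point, a standard meteorological formula rather than anything specified by ASHRAE:

Code (Python):

import math

def dew_point_c(temp_c, rel_humidity_pct):
    # Magnus approximation (assumed formula; reasonable for 0-60 C).
    a, b = 17.27, 237.7
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def within_recommended_envelope(temp_c, rel_humidity_pct):
    # 20-25 C, 40-55% RH, dew point at most 17 C, per the
    # ASHRAE recommendation quoted above.
    return (20 <= temp_c <= 25
            and 40 <= rel_humidity_pct <= 55
            and dew_point_c(temp_c, rel_humidity_pct) <= 17)

print(within_recommended_envelope(22, 50))  # True: comfortably inside
print(within_recommended_envelope(27, 60))  # False: too hot, too humid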



dhilipkumar

Data center : Network infrastructure

Communications in data centers today are most often based on networks running the IP protocol suite. Data centers contain a set of routers and switches that transport traffic between the servers and to the outside world. Redundancy of the Internet connection is often provided by using two or more upstream service providers (see Multihoming).

Some of the servers at the data center are used for running the basic Internet and intranet services needed by internal users in the organization, e.g., e-mail servers, proxy servers, and DNS servers.
Network security elements are also usually deployed: firewalls, VPN gateways, intrusion detection systems, and so on. Monitoring systems for the network and some of the applications are also common, as are additional off-site monitoring systems, which keep watch in case communications inside the data center fail. (A minimal external check is sketched below.)
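A minimal sketch of the kind of external reachability check an off-site monitor might run against those services (the host names and ports below are placeholders, not real infrastructure):

Code (Python):

import socket

SERVICES = [
    ("mail.example.com", 25),     # e-mail server
    ("proxy.example.com", 3128),  # proxy server
    ("ns1.example.com", 53),      # DNS server
]

def is_reachable(host, port, timeout=5):
    # Try a plain TCP connection; any OS-level error counts as down.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host, port in SERVICES:
    status = "UP" if is_reachable(host, port) else "DOWN, raise an alert"
    print(f"{host}:{port} is {status}")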


Applications

The main purpose of a data center is running the applications that handle the core business and operational data of the organization. Such systems may be proprietary and developed internally by the organization, or bought from enterprise software vendors. Common examples of such applications are ERP and CRM systems.

A data center may be concerned with just operations architecture or it may provide other services as well.
Often these applications will be composed of multiple hosts, each running a single component. Common components of such applications are databases, file servers, application servers, middleware, and various others.

Data centers are also used for off-site backups. Companies may subscribe to backup services provided by a data center, often in conjunction with backup tapes. Backups can be taken of servers locally onto tapes; however, tapes stored on site pose a security threat and are also susceptible to fire and flooding. Larger companies may also send their backups off site for added security, which can be done by backing up to a data center. Encrypted backups can be sent over the Internet to another data center, where they can be stored securely.
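A minimal sketch of encrypting a backup before it leaves the building, using the Python cryptography package's Fernet recipe (the file names are placeholders, and the actual transfer step, e.g. over SFTP, is omitted):

Code (Python):

from cryptography.fernet import Fernet  # pip install cryptography

# Generate a key once and store it safely, ideally off site itself;
# losing the key means losing the backup.
key = Fernet.generate_key()
fernet = Fernet(key)

# Read the archive and encrypt it in one go. Fine for a sketch;
# a very large archive would be processed in chunks instead.
with open("backup.tar", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("backup.tar.enc", "wb") as f:
    f.write(ciphertext)

# backup.tar.enc can now be sent over the Internet; only a holder
# of the key can recover it with fernet.decrypt(...).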

For disaster recovery, several large hardware vendors have developed mobile solutions that can be installed and made operational in a very short time. Vendors such as Cisco Systems, Sun Microsystems, and IBM have developed systems that could be used for this purpose.

dhilipkumar

What is DTS-WAV files

What exactly is a DTS file? DTS is a multi-channel (usually 5.1 or 7.1) lossy home theater audio format created by Digital Theater Systems.

DTS is available on DVDs, but not losslessly; for this guide we will be playing back DTS-WAV files, which come from enhanced audio CDs, concert DVDs, or DTS CDs.

It is important to have AC3Filter installed. AC3Filter is an open source AC3 decoding filter that allows you to watch videos with AC3-encoded surround audio.

After you install the filter, all video players that use DirectShow, including Microsoft's Windows Media Player, should be able to play AC3 and DTS audio correctly. The filter also supports Pro Logic II audio.

After you have installed AC3Filter, open AC3Filter Config (Start -> All Programs -> AC3Filter -> AC3Filter Config), head to the System tab, and make sure DTS is selected.
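If you want to check whether a .wav file really carries a DTS stream rather than plain PCM, a crude probe is to scan for a DTS core sync word. A minimal sketch, assuming the two commonly documented byte patterns (real players probe far more carefully, so treat this as illustrative only):

Code (Python):

DTS_SYNC_PATTERNS = [
    bytes.fromhex("7FFE8001"),  # 16-bit big-endian DTS core (assumed)
    bytes.fromhex("FF1F00E8"),  # 14-bit little-endian, DTS CD style (assumed)
]

def looks_like_dts_wav(path, probe_bytes=65536):
    # Read the start of the file and look for either sync pattern.
    with open(path, "rb") as f:
        data = f.read(probe_bytes)
    return any(pattern in data for pattern in DTS_SYNC_PATTERNS)

print(looks_like_dts_wav("concert.wav"))  # placeholder file name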


aadhar

Blu-ray is the successor to DVD. The standard was developed collaboratively by Hitachi, LG, Matsushita (Panasonic), Pioneer, Philips, Samsung, Sharp, Sony, and Thomson. It became the default optical disc standard for HD content and optical data storage after winning a format war with HD-DVD, the format promoted by Toshiba and NEC.

The format's name comes from the fact that a blue laser reads from and writes to the disc rather than the red laser of DVD players. The blue laser has a 405 nanometer (nm) wavelength that can focus more tightly than the red lasers used for writable DVD. As a consequence, a Blu-ray disc can store much more data in the same 12 centimeter space. Like the rewritable DVD formats, Blu-ray uses phase change technology to enable repeated writing to the disc.
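The capacity gain can be estimated from the optics: spot size scales roughly with wavelength divided by the lens's numerical aperture (NA), and areal density with the inverse square of the spot size. A minimal sketch, assuming the commonly quoted NA values of 0.60 for DVD and 0.85 for Blu-ray (not stated in the post above):

Code (Python):

dvd_wavelength_nm, dvd_na = 650, 0.60  # red laser
bd_wavelength_nm, bd_na = 405, 0.85    # blue-violet laser

dvd_spot = dvd_wavelength_nm / dvd_na
bd_spot = bd_wavelength_nm / bd_na
density_gain = (dvd_spot / bd_spot) ** 2

print(f"Approximate areal density gain: {density_gain:.1f}x")

This prints roughly 5.2x, which lines up with 25 GB per Blu-ray layer versus 4.7 GB per single DVD layer.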

dhilipkumar

Thanks for the detailed explanation about Blu-ray....

Keep sharing with us.....

dhilipkumar

USB 3.0 - What You Need to Know

Now there's an upgrade to USB on the way. Here's what you need to know about the coming USB 3.0.

• It's fast: Dubbed SuperSpeed USB, it will offer transfer speeds of 4.8 Gbps, compared with High-Speed USB's 480 Mbps (see the rough comparison sketched after this list).

• It's backwards compatible: Your existing USB 2.0 stuff will also work on the 3.0 ports and vice versa, although you won't get the "super speeds."

• It's coming soon: Vendors will ship some boards at the end of this year, so mainstream consumers should see them on their computers and certain devices starting in 2010.

• It's powerful: Like USB 2.0, it will transmit electricity, which means you can still use it to charge your gadgets.

• It's energy efficient: It supports reduced power operation and an idle power mode, but it will still make your CPU work like crazy to help it reach those fast data transfer speeds.

• It's backed by all vendors: Early on, both AMD and Nvidia were kind of miffed at Intel for holding back on some of the specification details, but that's all over, and everyone's now on board.

• It will end the longing for FireWire's resurrection: The faster speeds will mean that sending data to an external hard drive isn't as grindingly slow.

• Or will it instead keep the FireWire flame lit? Without the threat of FireWire competing against USB products, it's possible we won't see prices for the technology drop as rapidly as they did in previous generations.

• Devices that generate big data will be the first to appear with the standard: large flash drives, hard drives, video cameras and high-end still cameras, because they benefit most from faster data transfer rates.

• It's a way to create the anti-cloud: Instead of accessing everything online either through downloads or streaming, you can store gobs of content on hard drives, and have relatively fast access to it with USB cables. That might be handy if strict data caps are implemented or you think you'll be without broadband for a while.
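To put the first bullet in perspective, here is a back-of-the-envelope comparison of the raw signaling rates. Real-world throughput is considerably lower (protocol overhead, and line coding on the wire), so treat these as upper bounds:

Code (Python):

def transfer_minutes(size_gb, rate_gbps):
    # size_gb is in gigabytes, rate_gbps in gigabits per second.
    bits = size_gb * 8 * 1e9
    return bits / (rate_gbps * 1e9) / 60

file_gb = 25  # e.g. a full single-layer Blu-ray image
print(f"USB 2.0 (0.48 Gbps): {transfer_minutes(file_gb, 0.48):.1f} min")
print(f"USB 3.0 (4.8 Gbps):  {transfer_minutes(file_gb, 4.8):.1f} min")

At the raw rates, the same 25 GB copy drops from about 7 minutes to well under 1 minute.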

dhilipkumar

Petaflop -

A petaflop is a measure of a computer's processing speed and can be expressed as:

◙ A quadrillion (thousand trillion) floating point operations per second (FLOPS)
◙ A thousand teraflops
◙ 10 to the 15th power FLOPS
◙ Roughly 2 to the 50th power FLOPS
In June 2008, IBM's Roadrunner supercomputer was the first to break what has been called "the petaflop barrier." By November 2008, when the annual rankings of the Top 500 supercomputers were released, two computers had done so: at 1.105 petaflops, Roadrunner retained its top place from the previous list, ahead of Cray's Jaguar, which ran at 1.059 petaflops.
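The equivalences above are easy to verify with a few lines of arithmetic:

Code (Python):

petaflop = 10**15
print(f"1 petaflop     = {petaflop:.3e} FLOPS")
print(f"1000 teraflops = {1000 * 10**12:.3e} FLOPS")
print(f"2**50          = {2**50:.3e}")  # about 1.126e15, hence "roughly"

# Roadrunner's 1.105 petaflops expressed directly:
print(f"Roadrunner: {1.105 * petaflop:.3e} floating point ops per second")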

Breaking the petaflop barrier is expected to have profound and far-reaching effects on the future of science. According to Thomas Zacharia, head of computer science at Oak Ridge National Laboratory in Tennessee, where the Cray Jaguar is housed, "The new capability allows you to do fundamentally new physics and tackle new problems. And it will accelerate the transition from basic research to applied technology."

techtarget

nuttewz

That's great news!



pradeep prem

It's really amazing, good to know this.
This thread makes these questions easy to get.