DDR3 Is The Best in 2008

The New DDR

In electronic engineering, DDR3 SDRAM, or double-data-rate three synchronous dynamic random access memory, is a random access memory technology used for high-speed storage of the working data of a computer or other digital electronic device.
DDR3 is part of the SDRAM family of technologies and is one of the many DRAM (dynamic random access memory) implementations. DDR3 SDRAM is an improvement over its predecessor, DDR2 SDRAM.
The primary benefit of DDR3 is the ability to run its I/O bus at four times the speed of the memory cells it contains, thus enabling faster bus speeds and higher peak throughput than earlier memory technologies. However, the greater bus speed and throughput are achieved at the cost of higher latency. In addition, the DDR3 standard allows for chip capacities of 512 megabits to 8 gigabits, effectively enabling a maximum memory module size of 16 gigabytes.
DDR3 memory promises a power consumption reduction of 30% compared to current commercial DDR2 modules due to DDR3's 1.5 V supply voltage, compared to DDR2's 1.8 V or DDR's 2.5 V. The 1.5 V supply voltage works well with the 90 nanometer fabrication technology used for most DDR3 chips. Some manufacturers further propose using "dual-gate" transistors to reduce leakage of current.
According to JEDEC the maximum recommended voltage is 1.575 volts and should be considered the absolute maximum when memory stability is the foremost consideration, such as in servers or other mission critical devices. In addition, JEDEC states that memory modules must withstand up to 1.975 volts before incurring permanent damage, although they may not actually function correctly at that level.
The main benefit of DDR3 comes from the higher bandwidth made possible by DDR3's 8-bit-deep prefetch buffer, in contrast to DDR2's 4-bit prefetch buffer or DDR's 2-bit buffer.
Theoretically, DDR3 modules can transfer data at the effective clock rate of 800–1600 MHz using both rising and falling edges of a 400–800 MHz I/O clock. In comparison, DDR2's current range of effective data transfer rate is 400–800 MHz using a 200–400 MHz I/O clock, and DDR's range is 200–400 MHz based on a 100–200 MHz I/O clock. To date, the graphics card market has been the driver of such bandwidth requirements, where fast data transfer between framebuffers is required.
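As a back-of-the-envelope illustration (my own arithmetic, not from the article), the peak throughput behind a module name like PC3-12800 falls out of the clock figures above, since a standard DIMM moves 8 bytes per transfer:

```python
# Peak theoretical bandwidth of a standard 64-bit (8-byte-wide) DDR3 DIMM.
# The PC3-* module name encodes this peak figure in MB/s.

def ddr3_peak_mb_per_s(io_clock_mhz: float, bus_width_bits: int = 64) -> float:
    """Double data rate: two transfers per I/O clock cycle."""
    transfers_per_s = io_clock_mhz * 1e6 * 2               # cycles/s -> transfers/s
    return transfers_per_s * (bus_width_bits // 8) / 1e6   # bytes/s -> MB/s

# DDR3-1600 uses an 800 MHz I/O clock:
print(ddr3_peak_mb_per_s(800))  # 12800.0 MB/s, hence the PC3-12800 label
```

DDR2-800 plugs into the same formula with its 400 MHz I/O clock, which is why the fastest common DDR2 parts and the slowest DDR3 parts overlap at 6400 MB/s.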
DDR3 prototypes were announced in early 2005. Products in the form of motherboards are appearing on the market as of mid-2007 based on Intel's P35 "Bearlake" chipset and memory DIMMs at speeds up to DDR3-1600 (PC3-12800). AMD's roadmap indicates their own adoption of DDR3 in 2008.
DDR3 DIMMs have 240 pins, the same number as DDR2, and are the same size, but are electrically incompatible and have a different key notch location.

taken from : http://id.wikipedia.org/wiki/MIRC


Enrich Your Computing Style with Acer Laptops

Computers have always been among the most sought-after gadgets, as they can perform various high-end tasks within minutes. With these machines a person can efficiently carry out many tasks without any hassle. In fact, computers have rapidly reduced the need for manual labour in various industries and firms. However, desktop machines are heavy, and it is not always possible to carry them wherever we want. The laptop was therefore developed to make people more techno-savvy. These high-end devices come with all the advanced features needed to offer maximum satisfaction to users.

As people's expectations keep increasing, developments in the field of technology have managed to keep a similar pace. In fact, almost every day a technically advanced gadget is launched to enrich the lifestyle and working style of users. We always search for the best options and never compromise on quality and service. As laptops are the latest craze these days, it is also very important to purchase the best, branded gadget. Acer laptops are ruling the market with their varied high-tech features.

Amidst all the top-notch brands like Dell, HP and Sony, Acer seems to have created ripples in consumers' minds with its innovative features. It is also essential to check the quality and price of a gadget before any purchase. Although Acer has been in the market since the mid-1970s, it first made its presence widely felt around the year 2000, and after that there was no looking back. The brand has been acclaimed globally for its quality and price, and has launched various series of laptops to meet the requirements of every class of professional. The latest Gemstone series has enjoyed enormous popularity due to its high-end technical specifications; it was launched specially for the mid-level segment of consumers. Moreover, the company offers innovative features in its machines, such as LCD panels, Blu-ray drives and a unique touch panel to enhance multimedia management.

Acer laptops basically come with a glossy top panel and an attractive LCD panel. The AS 4310 and AS 4710 series are targeted at empowering ordinary users with high-end computing technology and are built on the innovative 'Pebble' design. These models support the Linux operating system, come with an Intel Core 2 Duo mobile processor T5500 and a 160 GB hard disk drive, and can be had at very reasonable rates. A budget option is the Aspire 4310 series, which uses a Celeron M 520; these notebooks carry 512 MB of system memory, an elegant widescreen display and an 80 GB hard disk drive. The most attractive range at the moment is the Aspire 5920 series, which runs Windows Vista on a Core 2 Duo mobile processor T7500 with a 15.4" widescreen and 4 GB of memory. That is amazing.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=938628


By playing your favorite video games you can earn easy money.

I have always liked playing games, because they not only refresh my mind but have also made me happier and more successful in life. Now you must be wondering how games can make someone happier and successful. The answer is simple: today, people will pay you to play games. Doesn't that attract you? The gaming industry is on the rise, and it just keeps growing and growing.

Just imagine getting a favorite game before its actual release, without spending a single penny on it, and getting first-hand experience of the game. Isn't that terrific? Today the gaming industry is worth billions of dollars, and anyone who lands a job as a game tester in it will find it worth their pay.

Game testing nowadays is considered a serious and exciting career. In game testing you get to play the latest games, find bugs in them, and get paid for playing and reporting those bugs. You also get to know the strategies, game plans and stories of a game before its actual release, something only game testers know.

There are a lot of people who work as game testers and get paid to test games right from their own homes. Companies hire ordinary people to test their unreleased games so that they can get feedback on the quality and playability of the games before releasing them to the public.

Building games in today's competitive market requires a lot of investment, manpower and skill. Companies don't want to design and release a game that no one buys, making no sales and losing money. So if a person is able to find a bug or an error in a game, he or she not only saves money for the producer developing the game but also makes money for himself or herself.

Seeing all this, you might be wondering what the requirements are to qualify among those who are paid to test video games before they are released. The requirements are simply that you must know how to play video games, have some knowledge of computers, and have internet access on your computer, since in the end you just have to send your opinions on the games to the manufacturers.

So, if you're a gaming freak or know a bit about gaming, give game testing a try. The best place to start your career is here: Click Here!

Then what are you waiting for? Don't miss the wonderful opportunities provided by the site, where you are given game demos, or the actual game, ahead of release for testing. You are also paid for playing those games and for your valuable opinions. They also give away lots of bonus prizes, such as a PlayStation 3 with a 60 GB HDD.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=937798


Make an object seem to disappear behind another object

You can make an object seem to disappear behind another object, or appear from behind that object. This effect creates a 3D impression and seems quite magical.

In order for this to work, you need to create the appearance of a 3D environment that your object can disappear behind. This takes some experimentation. You also need to play with the order of the objects, by right-clicking and using the Order item on the shortcut menu.

To create this animation, follow these steps:

1. Select the object that you want to animate.
2. Choose Slide Show > Custom Animation. (In PowerPoint 2007, go to the Animations tab > Custom Animation.)
3. In the Custom Animation task pane, choose Add Effect > Exit > Peek Out. (If Peek Out isn't on the list, click More Effects.)
4. Specify the Start, Direction, and Speed settings. I used On Click, To Bottom, and Slow, respectively.

If it doesn't look right, fiddle with the position of the objects and their order on the slide.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=941509


Types of Software Testing

A. Black Box

* Acceptance Testing
* Exploratory Testing
* Functional Testing
* Integration Testing
* Performance Testing
  - Load Testing
  - Stress Testing
  - Volume Testing
* Regression Testing
* Smoke Testing
* Usability Testing
* Sanity Testing
* Installation Testing
* System Testing
* UI Testing

B. White Box

i. Unit Testing
ii. Coverage Testing
iii. Basis Path Testing
   1. Flow Graph Notation
   2. Cyclomatic Complexity
   3. Deriving Test Cases
iv. Control Structure Testing
   1. Condition Testing
   2. Data Flow Testing
   3. Loop Testing
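As a small illustration of the white-box items above (my own sketch, not from the article): cyclomatic complexity counts a function's decision points plus one, and basis path testing derives one test case per independent path.

```python
# Basis path testing sketch: this function has two decision points,
# so its cyclomatic complexity is V(G) = 2 + 1 = 3, and basis path
# testing calls for three test cases, one per independent path.

def classify(n: int) -> str:
    if n < 0:          # decision point 1
        return "negative"
    if n == 0:         # decision point 2
        return "zero"
    return "positive"

# One unit test per basis path:
assert classify(-5) == "negative"  # path: decision 1 taken
assert classify(0) == "zero"       # path: decision 2 taken
assert classify(7) == "positive"   # path: both decisions false
print("all 3 basis paths covered")
```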




taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=926437


What You Can Do To Repair And Recover Your Windows Registry

If you're reading this article, it probably means you're not happy. You've probably just lost some work, seen that infamous blue screen of death, or gotten an error message about corrupted registry values. Whatever the cause, you're here because your computer is having major issues and you need to recover your Windows registry to set things right.

Recovering from a registry error:

Fortunately, recovering the Windows registry isn't that difficult. In fact, there are a ton of recovery programs out there that can help you, and some of them are even free. First, before we start tinkering with the Windows registry (it's the database that lists everything on your computer and helps it all work together, so it's not something you want to mess with unless you're sure it's the problem), I recommend doing a scan of your registry to see if the issue truly lies there. You can find reviews of a number of free Windows registry scanners at our website.

After the scan has completed, the program will give you a list of repair choices to fix any problems it might have found. If the registry scanner you're using tries to redirect you to another site where you have to buy a full license to fix the problem, you should know that there are some that will complete the fix for free.

If you need an idea of which Windows registry tool to get, you can read reviews and learn about them on our website. We will tell you which programs are free or offer a trial period. This way, you can fix your registry programs and decide if you want to buy the full version of a registry tool or not.

After the recovery is complete:

Now that your registry is fixed, you can go on working with no problems, right? Wrong. Registry problems are likely to crop up again, especially if you add or delete a lot of software or hardware from your computer. You should keep that Windows registry repair tool handy since you'll probably need it in a few months.

Why do these errors keep coming back? The registry, as mentioned earlier, lists everything on your computer. The more programs and hardware you add, the more entries in your registry. The way the registry is set up, these entries should be deleted when you remove any of these programs or devices. However, that doesn't always happen. Viruses, adware, spyware, and other unwanted things can also corrupt the registry, adding to your problems.

Registry protection:

So how can you keep this from happening? Simple: use that registry scanner you found on a regular basis. You don't need to run it every day, but you should scan your Windows registry at least once a month, if not every two weeks or so. Doing this will keep your computer running smoothly and ensure that you don't lose all your work to a crash.

In addition to regularly running a Windows registry scan, you'll want to continue to scan for spyware and viruses. Doing all of these scans at once will really clean out your computer, and if you can set your computer to do these scans automatically on a certain day, it requires no real thought at all. Just be sure to have them scan at a time when you don't need your computer for important work.

What you need to remember to keep a healthy registry:

* Confirm through a registry scanner that Windows registry problems are occurring on your computer.

* Check out the registry repair systems and download one to repair your registry.

* Run registry scans regularly to fix problems as they occur.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=916915


A small history of Microsoft Exchange 2007

Microsoft Exchange 2007 is collaborative software meant to aid people who are involved in the same tasks achieve their goals. But how did it all start? What and when were the bases for this software set and what improvements have been made over the years?

Microsoft's own migration from its 'legacy XENIX-based messaging system' to Exchange began in 1993, and by 1995, 500 users were already on the first version. By April 1996 there were already 32,000 users who had migrated to this environment.

Exchange Server 4.0 was introduced to the public in June 1996 as an upgrade to Microsoft Mail 3.5. In 1997, two other versions were released, Exchange 5.0 and 5.5. The dawn of the new millennium marked the release of Exchange Server 2000, which overcame many of the limitations the previous versions suffered from, but it also had some difficulties when it came to upgrading.

The predecessor of Microsoft Exchange 2007, the 2003 version, eradicated the upgrade difficulties, allowing the users to slowly migrate to the new environment without any prolonged downtimes or unnecessary expenses. It also had a number of improvements when compared to the previous versions, like new applications and better data recovery options.

Until the Microsoft Exchange 2007 was released, there were a lot of people that were uncertain as far as the future of the product goes. This was due to the fact that the release of the add-on Edge Services was dropped, even though it was announced in 2005.

The main purpose for the Exchange software is electronic mail, tasks and contacts, calendaring and other similar activities. The release of the Microsoft Exchange 2007 marked the integration of voice mails, better filtering, and a new interface of Outlook Web Access.

The Microsoft Exchange 2007 support platforms are decidedly 'new age', reflecting the substantial improvements that modern hardware can bring to the product. It runs only on 64-bit versions of Windows Server, which means that if you use 32-bit software and hardware, both will need to be replaced.

Microsoft Exchange 2007 cluster support has also changed dramatically. Unlike its predecessor, the 2003 version, which allowed active-active node clustering, Exchange 2007 drops support for this type of clustering because of its poor performance.

You may be able to find a service provider that hosts this software and provides Exchange 2007 support round the clock. You can also search the internet for more information about whatever you want to know about this product.

In order to keep you from spending hours on the web trying to find the information that you want, you should go directly to the target and visit the website hostedex.com. This is the place where all your questions regarding Microsoft Exchange 2007 will be answered.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=913189


The NVidia 9900GTX Analyzed

According to rumors going around the computer Internet community, the next GPUs to be released by nVidia will be the 9900GTX and the 9900GX2.

Now, nVidia just released the 9800GX2 and 9800GTX GPUs a few weeks ago. So why would they keep releasing more and more new ones? Well, nVidia is releasing the 9900 series to replace the 9800 series: the 9900GTX will replace the 9800GTX, and the 9900GX2 will replace the 9800GX2. But again, why would they do this?

Frankly, I don't really know what the heck they're thinking. Doesn't that sound familiar though? That's the same thing they did for the 9800GTX; it was released to replace the former 8800GTX. So, why would they do the same thing twice? Well, a rather simple idea is that nVidia is stalling for more time. More time for what? To develop new chips of course! 

Have you noticed that ever since the debut of the G92 chip in the 8800GT, they've been stalled in development? Here, look at their GPUs: the 8800GT, the 8800GTS G92, the 9600GT (the G94, a close sibling of the G92), the 9800GX2 (just two G92 chips put together on a single card) and the 9800GTX (again, the G92 chip). They've been using the same chip for many months now. In my opinion, now would be the time for something new, something so original that it just beats every other GPU.

If they keep this up, they may give ATi enough time to catch up in market share, and maybe even beat nVidia. Then no more near-monopoly! Who knows what will happen, but this will be a very interesting battle.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=902658



If you don't work with graphics on a regular basis, you may not know which image format to use for your web projects. It can be a bit confusing. If you're a beginner, take heart; I've seen even experienced webmasters use the wrong formats.
To help you get a better understanding of what to select and when, let's look at some common image formats available.

The BMP Format

The BMP format is Windows' default image format, often referred to as a "bitmap".

For many reasons, I've always felt that Windows is its own worst enemy. This couldn't be more true when it comes to its default image format. BMP images are very large in file size, and there's not much you can do to remedy that. What you see is what you get.

Consequently, this format is not a good candidate for web use. The browser simply can't load a BMP file in an acceptable amount of time. If your favorite photo is in BMP format and you want to add it to your website, use one of the many graphics programs available to convert it to a JPG format first.
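As a quick sketch of that conversion using the third-party Pillow library (the file names and the solid-color sample image are my own placeholders, not from the article):

```python
# Convert a BMP to JPG with Pillow (pip install Pillow).
import os
import tempfile
from PIL import Image

folder = tempfile.mkdtemp()
bmp_path = os.path.join(folder, "photo.bmp")
jpg_path = os.path.join(folder, "photo.jpg")

# A small solid-color bitmap stands in for "your favorite photo".
Image.new("RGB", (64, 64), "steelblue").save(bmp_path)

# The conversion itself: open the BMP, re-save as JPEG.
Image.open(bmp_path).convert("RGB").save(jpg_path, "JPEG", quality=85)

print(os.path.getsize(bmp_path), "bytes as BMP")
print(os.path.getsize(jpg_path), "bytes as JPG")
```

On a real photo the size difference is less dramatic than with a solid color, but the JPG will still be a small fraction of the uncompressed BMP.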

The JPG Format

The JPG format was designed and named by the Joint Photographic Experts Group. Technically speaking, it's not an image format. It's an image-compression standard.

It is the most common format for use with photographs. The good and the bad of the JPG format is its "lossy compression".

In simple terms, "lossy compression" is a method of reducing image file size by throwing away unneeded data, causing a slight loss of image quality. Most image editing programs, like Photoshop, allow you to choose how much compression you want.

Another term for this compression process is "optimization." When I deliver optimized JPG images to my clients, I generally compress them to about 60%. I've found this to be the magic number that reduces the file to a reasonable size yet doesn't compromise the quality of the image.

Optimizing further leaves blotches and speckled blocks of color. The colors look washed-out and a lot of fine detail is lost.
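A minimal sketch of that optimization trade-off with the third-party Pillow library (the noise image and the exact byte counts are my own illustration; note that 60 is a quality setting, not a percentage of the original file size):

```python
import io
from PIL import Image

# A noisy sample image stands in for a detailed client photo.
img = Image.effect_noise((128, 128), 64).convert("RGB")

def jpeg_size(image, quality):
    """Byte size of the image saved as JPEG at the given quality."""
    buf = io.BytesIO()
    image.save(buf, "JPEG", quality=quality)
    return buf.tell()

# Lower quality -> smaller file; push it too far below ~60 and the
# blotches and speckled blocks described above start to appear.
for q in (95, 60, 10):
    print(f"quality {q}: {jpeg_size(img, q)} bytes")
```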

One of the biggest advantages I find in using the JPG format is that I can include enhancements like reflections and drop shadows which provide depth to an image when making 3d graphics. Almost all other formats look horrible with these enhancements.

I say almost because there is one other format that displays these enhancements very well, but we'll cover that later.

Another thing to note here is that saving and RE-saving a JPG file causes loss of quality as well. Every time you re-save it, you lose a bit more.

The GIF Format

The GIF format was the first image format used on the web, invented by Compuserve in 1987.

The worst part about the GIF format is its restrictive nature: it will only display a maximum of 256 colors. It's great for clipart, diagrams, charts and other graphics that only have a small number of colors in them. But it does absolutely horrible things to photographs.

It's also one of the few formats that you can use to make a transparent background and the only format I'm aware of used to make animations.

The PNG Format

The PNG format is the new kid on the block. Its name stands for Portable Network Graphics. Somebody got tired of putting up with the outdated GIF file and designed the PNG as a replacement for it. What a blessing that was!

There are two levels of quality to PNG compression. PNG-8 is pretty much like a GIF: it uses a smaller number of colors and the quality is much the same.

The PNG-24, however, has what is known as "alpha transparency." Unlike its GIF counterpart and the PNG-8, it sports the ability to blend in perfectly as a transparent image, even when enhancements like reflections and drop shadows are added.

The PNG-24 allows the use of millions of colors, versus the restrictive GIF format with only 256. This makes the PNG format a terrific alternative to a GIF for photographs and other multi-colored images with a transparent background. It's like having a GIF and a JPG all rolled up in one.
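A short sketch of what alpha transparency means in practice, again with the third-party Pillow library (my own example, not the author's):

```python
import io
from PIL import Image

# An RGBA image: the fourth channel is alpha, where 0 is fully
# transparent and 255 fully opaque. Here: half-transparent red.
img = Image.new("RGBA", (32, 32), (255, 0, 0, 128))

buf = io.BytesIO()
img.save(buf, "PNG")      # PNG keeps the alpha channel intact
buf.seek(0)
reloaded = Image.open(buf)
print(reloaded.mode, reloaded.getpixel((0, 0)))

# JPEG, by contrast, cannot store alpha at all: you must drop the
# channel with img.convert("RGB") before saving as JPEG.
```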

Granted, the file sizes are larger than a JPG, but with today's connection speeds and the ability to slice images into pieces, file sizes are less of a problem than they used to be just a few years ago.

Unlike the JPG format, which loses quality each time it's saved, PNG compression is lossLESS. This means you can save it once or a hundred times and it won't lose its quality.

As of this writing, the only downside to using the PNG-8 or PNG-24 format is its inability to render properly, if at all, in Internet Explorer 5.5 and 6.

The good news is, there is a workaround which solves the problem until Microsoft addresses the issue. Thanks to Angus Turnbull for making it available; search for his name in Google to find his website.

To sum it all up, select the GIF or PNG-8 format for transparency, the GIF format for animations and the JPG or the PNG-24 for images that require top quality rendering.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=902714


Free Registry Fix - Computer Maintenance, All For Free

The Windows registry is a database that stores information about all the software and hardware on a computer system; the system configuration and its related files are all recorded there. It is therefore important for the user to keep these files intact and safe, and to use tools for a free registry fix. Any deletion or modification of these files can corrupt the Windows operating system or cause errors in the files loaded on the system. All Windows-based operating systems have registry files, which are hidden on the hard disk of the system.
See How Important The Registry Is

These registry files contain information on the settings and options of Windows, in both the 32-bit and 64-bit versions, including Windows 95, 98, ME and NT. This database is important to the performance of the computer system: a small change in the registry can crash the system. Any change made to the software or hardware is recorded in the registry files, and once things go wrong it becomes extremely difficult to fix the Windows registry; even the option of a free registry fix may not help. Any change made to Control Panel settings, system policies, installed software or file associations is automatically saved and stored in the registry of the computer.

Save Originals Before Fixing The Problem

Users often try the free registry fix offers that can be seen online, but in many cases they don't possess adequate knowledge. In trying to fix a problem and improve the computer's speed, they unknowingly delete important files which are then not easily recoverable. Users are therefore advised to back up the registry before making any changes to it. The registry files are stored in different locations on the hard disk, and their storage differs between versions of Windows; under Windows 95 and Windows 98, for example, the registry is kept in two hidden files in a system directory. So even if you are using a free registry fix to sort out the problem, get all the related information first.

Nowadays, a user can purchase software to remove malicious files and viruses or fix registry errors on the system. Such programs are easily available and can often be tried online before users buy them. Apart from this, free registry fix software is also available that a user can use to fix registry files without paying a penny. Despite not fully understanding them, users try to access registry files, and this causes havoc. So stay away from limited information and gather full knowledge before you actually start working on the problem.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=902940


Microsoft Office Training Tips

Learning Microsoft Office proves to be a challenge for many people. Some try simply opening Microsoft Office and navigating through the software without actually knowing how to use it. Not much is attained this way, as only those who are already software savvy will learn the program like that. For everyone else it is a waste of time, because they need Microsoft Office training to learn the software. There are different Microsoft Office training methods to choose from, such as Microsoft Office training disks, training software and online training. Microsoft Certified Partners for Learning Solutions provide both online and classroom training using Microsoft's curriculum, an option favored by IT developers and professionals.
Microsoft E-Learning Library

IT professionals also like the Microsoft E-Learning Library, where they can learn all about Microsoft certification through browser-based training. With a new or existing Volume Licensing Agreement, you are provided access to the e-learning library. IT professionals also find many books helpful for studying and training for their certification exams, and some books come with Microsoft Office training CDs for a better understanding of Microsoft Office. And if you have Software Assurance, you will also be able to receive free products from Microsoft.

Computer Based Microsoft Office Training Options

The Microsoft Office training CD is an excellent option not only for those wanting to learn Microsoft Office, but also for people already familiar with the program who want to brush up on a few areas. There is nothing complicated about this kind of tutorial; all you have to do is sit in front of a computer and work through it. You can focus on the areas you need help with or go through the entire CD, part by part. There are numerous tutorials and modules that prove helpful to people involved in eLearning, giving them access to lots of information on a variety of topics.

Microsoft Office training is offered with Core and Advanced Training, along with other desktop applications including SQL Server, Windows Server, Exchange Server and Core Training for Windows. There are also training vouchers allotted to clients for their Windows and Microsoft Office licenses; with these vouchers, you get a day of free training if you have Software Assurance coverage.

When venturing into Microsoft Office training, it is important that you focus on learning Microsoft Office. If your mind is diverted and focused on other things, you will never actually learn it. Time allotment is also important: it is not possible to learn an hour's worth of training in 15 minutes, so allot your time accordingly and you will be able to learn Microsoft Office as soon as possible.

With the wide and varied Microsoft Office training options you have, you are sure to find an option that suits you the most. No matter what your time table is like or how busy your day may be, there is sure to be a Microsoft Office training package that suits you.

taken from : http://www.goarticles.com/cgi-bin/showa.cgi?C=903149


Advantages And Disadvantages of Bluetooth

Bluetooth has a lot to offer in an increasingly crowded marketplace. It brings with it the promise of freedom from cables and a simplicity in networking that has yet to be matched by a wired LAN (Local Area Network).

In the key marketplace of wireless and handheld devices, the closest competitor to Bluetooth is infrared. Infrared holds many key features, but it requires a line of sight: it doesn't go through walls or around obstacles the way Bluetooth does.

Unlike infrared, Bluetooth isn't limited to line of sight, and it provides ranges of up to 100 meters. Bluetooth is also a low-power, low-processing-overhead protocol, which means it's ideal for integration into small battery-powered devices. In short, the applications for Bluetooth are virtually endless.

Disadvantages

Bluetooth has several positive features, and one would be extremely hard pressed to find downsides given the current competition. The only real downsides are the data rate and security. Infrared can offer data rates of up to 4 Mbps, which makes for fast data transfer, while Bluetooth only offers 1 Mbps.
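To put those peak rates in perspective (my own arithmetic; these are nominal link rates, and real-world throughput is lower):

```python
# Nominal peak link rates, in megabits per second.
RATES_MBPS = {"infrared (IrDA FIR)": 4.0, "Bluetooth 1.x": 1.0}

def seconds_to_send(megabytes: float, rate_mbps: float) -> float:
    """Ideal time to move a file over the link (1 byte = 8 bits)."""
    return megabytes * 8 / rate_mbps

for name, rate in RATES_MBPS.items():
    print(f"5 MB over {name}: {seconds_to_send(5, rate):.0f} s")
```

So a 5 MB file that infrared could move in about 10 seconds takes Bluetooth around 40, which is why the data rate counts as a genuine disadvantage.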

For this very reason, infrared has yet to be dispensed with completely and is considered by many to be the complementary technology to Bluetooth. Infrared also has inherent security due to its line of sight.

The greater range and radio frequency (RF) of Bluetooth make it much more open to interception and attack. For this reason, security is a very key aspect to the Bluetooth specification.

Despite these few disadvantages, Bluetooth remains the best short-range wireless technology. Those who have tried it love it, and they know that Bluetooth will be around for years to come.

taken from : http://www.goarticles.com/cgi-bin/search.cgi?c=7&title=Computers


Copyright © 2008 - Aziz Personal Blog - is proudly powered by Blogger