"It says 1366x768 so does that mean that will be the maximum resolution i can use for my PC?"
Yep. Anything higher would just be scaled back down to 1366x768 and lose image quality in the process.
"...so I switched to 1366x768 and its not as clear"
LCD displays have a fixed (native) resolution. For your monitor, that resolution is 1680x1050. Going any higher or lower than that would force the monitor to up- or down-sample the image to 1680x1050. This video processing greatly reduces image quality; that's why text looks horrible at 1366x768 on your monitor.
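A quick bit of arithmetic shows why the scaling hurts (illustrative numbers only; the exact scaling filter varies by monitor):

```python
# Why 1366x768 looks soft on a 1680x1050 panel.
native_w, native_h = 1680, 1050   # the panel's fixed pixel grid
signal_w, signal_h = 1366, 768    # the resolution the PC is sending

scale_x = native_w / signal_w
scale_y = native_h / signal_h

print(f"horizontal scale: {scale_x:.2f}x, vertical scale: {scale_y:.2f}x")
# Neither factor is a whole number, so each source pixel gets smeared
# across a fractional number of physical pixels -- hence the blurry text.
```

Whole-number scaling (say, 2x in each direction) can look acceptable, but fractional factors like these force the scaler to interpolate, and fine detail such as text suffers the most.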
The TV you plan on buying has a native resolution of 1366x768. Text and icons will look razor-sharp at this resolution on the TV.
"Am i also right in thinking for a higher res id need 1080P TV? not 1080i?"
1080p (progressive) and 1080i (interlaced) both display the same number of pixels: 1920x1080, or around two million. On any digital display (LCD, plasma), 1080p and 1080i will look exactly the same. That's because, unlike the old CRT HDTVs, digital sets can only display progressive scan video. If you were to feed a 1080i broadcast into a 1080p LCD/plasma, the set's video processor would first deinterlace the video, then display it as 1080p.
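The pixel counts are easy to check, and they also show how much a 1366x768 panel has to discard from a 1080 signal (plain arithmetic, not specific to any one TV):

```python
full_hd  = 1920 * 1080   # pixels per frame in 1080p and 1080i alike
tv_panel = 1366 * 768    # pixels on a 1366x768 set

print(full_hd)    # 2073600 -- the "around two million"
print(tv_panel)   # 1049088 -- roughly half as many
# A single 1080i field carries only 540 of the 1080 lines, but the
# deinterlacer reassembles full 1920x1080 frames before the panel
# ever sees them, so the displayed frame is the same either way.
```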
However, since the TV you want cannot display 1920x1080, 1080p/i broadcasts will be downsampled to 1366x768, progressive. Not that there's really a problem with that. Why?
Many HDTV broadcasts--including the 1080 stuff--are compressed to ~11-15 Mbps with the old MPEG-2 codec. MPEG-2 at 1080 pretty much sucks at such a low bit rate--720 at the same bit rate would look just as good (if not better). Unless you plan to watch Blu-Ray discs, there's not much point in getting 1080p on a set smaller than 50". Blu-Ray streams video at a much higher bit rate--up to 40 Mbps--and uses the much more efficient MPEG-4 AVC and VC-1 codecs. That's where 1080 really shines.
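A back-of-the-envelope bits-per-pixel comparison makes the point. The frame rates (30 fps for broadcast, 24 fps for film on Blu-Ray) and the bit rates here are assumed typical figures for illustration, not measurements of any particular stream:

```python
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average bits available per pixel per frame, ignoring codec overhead."""
    return bitrate_bps / (width * height * fps)

# Assumed, typical figures -- not taken from any specific broadcast.
mpeg2_1080 = bits_per_pixel(13e6, 1920, 1080, 30)   # MPEG-2 broadcast at 1080
mpeg2_720  = bits_per_pixel(13e6, 1280, 720, 30)    # same bit rate at 720
bluray     = bits_per_pixel(40e6, 1920, 1080, 24)   # Blu-Ray at its maximum

print(f"1080 broadcast: {mpeg2_1080:.2f} bits/pixel")
print(f" 720 broadcast: {mpeg2_720:.2f} bits/pixel")
print(f"      Blu-Ray:  {bluray:.2f} bits/pixel")
# The 1080 broadcast gets the fewest bits per pixel -- which is why it
# can look no better than 720 despite the higher resolution, while
# Blu-Ray has roughly four times the bit budget per pixel.
```

Bits per pixel isn't the whole story (codec efficiency matters too, and AVC/VC-1 do far more with each bit than MPEG-2), but it shows why a high bit rate is what lets 1080 pull ahead.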