Hello techie
You know, to be honest, in my comment in the body of the first post, the word "lazy" was the wrong choice. ...
---
...misprint, so it turns out that the 23” actually runs at 2048 x 1152. At that point I’m thinking: how fricking high are they going to crank the darn resolution? That looks like the max of my graphics card....
---
I understand that developers were not exposed to the higher resolutions with some of the older software, but today, if MS can apply an independent DPI setting to enlarge their toolbars, fonts, and dialog boxes without losing any quality or sharpness, why are companies like Adobe and ESET (to name a couple) not following the new standards with their latest software? I'm aware that some windows and dialog boxes are not dynamically expandable, but this DPI setting I'm referring to seems to be independent of the actual hardware screen resolution. I think it's emulating a lower screen resolution, but in a layered kind of manner, without stretching the icons, which is what happens when the Appearance tab is used.
On quote one: not to worry, I wasn't taking it personally.
I agree with a lot of what you say here, because I for one think the biggest threat to software development is "hubris".
In my soon-to-be 15 years of development-related work and PC work in both hardware and software, I have often been met with other developers' lack of interest in sharing knowledge about certain routines and practices.
It is as if they feel they are privy to something secret and special solely because they had a different manual, or access to other lines of the same field's education.
I, for instance, have focused on simplified GUIs for DBMS interaction, combined with web development, and have hardly had a need to focus much at all on GDI, DirectX and other graphical routines, until now, as I get into game and GUI development using partly 3D tools.
And here the problems begin...
In your quote 2, you remark on the excellent resolution, but at the same time state "That looks like the max of my graphics card....", which is where the root of a lot of the problem lies.
In the event you are not a game developer, you don't need to write code targeting the biggest VGA adapters, and I doubt we'll ever see a need to write pure HTML or C++ using a 2 GB VGA adapter.
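Just to make that concrete: if your code asks the adapter what it can actually do, you don't need the monster card on your own desk for the program to behave sensibly on one. A rough Win32 sketch (plain C++, primary display only, minimal error handling):

```cpp
// List every display mode the primary adapter reports, so the program can
// adapt to whatever resolution the user really has instead of assuming
// whatever happens to sit on the developer's desk.
#include <windows.h>
#include <cstdio>

int main()
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);

    // Walk the adapter's mode list; EnumDisplaySettings returns FALSE when done.
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
    {
        printf("%lu x %lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
    }
    return 0;
}
```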
Much like when I started to develop applications using Visual Studio, I was on a Windows 98 box using FAT32 as the filesystem.
The little change of upgrading to NTFS opened up the ability for me to compile software for both FAT32 and NTFS. So by changing my OS setup I could now build for other environments.
In other words, if you don't have the hardware at hand, it's very difficult to write code for it, and assuming all developers have access to all the latest native systems is optimistic at best. Even if you can write the code, that does not mean you can test it properly, no matter how closely you follow a defined API.
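Where the environment difference can at least be detected at runtime, you can hedge in code. A minimal sketch along the FAT32/NTFS lines above, assuming a Windows box and using the C: drive purely as an example:

```cpp
// Ask Windows which filesystem a volume is using, so one build can take the
// FAT32 path or the NTFS path at runtime instead of baking the assumption in.
#include <windows.h>
#include <cstdio>

int main()
{
    char fsName[MAX_PATH + 1] = {};
    DWORD serial = 0, maxComponentLen = 0, fsFlags = 0;

    if (GetVolumeInformationA("C:\\",            // volume to inspect (example)
                              NULL, 0,           // volume label not needed
                              &serial, &maxComponentLen, &fsFlags,
                              fsName, sizeof(fsName)))
    {
        printf("Filesystem on C: is %s\n", fsName);   // e.g. "NTFS" or "FAT32"
    }
    else
    {
        printf("GetVolumeInformation failed: %lu\n", GetLastError());
    }
    return 0;
}
```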
In the third quote above, you mention the comparison between MS and Adobe, or many other large software houses, not following the same principles.
This is simply because Microsoft tends to follow its own software routines, and expands on them for internal use before making them available to the greater public of developers.
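To tie that back to the DPI setting in quote three: it isn't magic, it's an API each application has to opt into, which is part of why Adobe and others lag behind. A rough sketch of the older (Vista-era) way of opting in; newer SDKs add per-monitor variants on top of this:

```cpp
// Opt in to DPI awareness and read the system DPI, so the UI can be drawn
// crisply at the right scale instead of being bitmap-stretched by Windows.
#include <windows.h>
#include <cstdio>

int main()
{
    // Without this call, Windows may report a virtualized 96 DPI and stretch
    // the window, which is exactly the blurriness being complained about.
    SetProcessDPIAware();

    HDC screen = GetDC(NULL);
    int dpiX = GetDeviceCaps(screen, LOGPIXELSX);
    int dpiY = GetDeviceCaps(screen, LOGPIXELSY);
    ReleaseDC(NULL, screen);

    // 96 DPI is the traditional baseline; 120 DPI means "draw everything 125% bigger".
    printf("DPI: %d x %d, scale factor %.2f\n", dpiX, dpiY, dpiX / 96.0);
    return 0;
}
```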
The next and even larger issue is the fact that where one developer works with DirectX 9, another uses OpenGL or other VGA routines to do their fancy footwork on the pixels.
Both are quite different to work with, just as with different language bases: C, C++, VB, C#, Python, Perl, Java, the list goes on...
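Just to show how little they resemble each other, here's the same trivial job, clearing the frame to blue, sketched in both. Device/context creation and the window loop are left out on purpose; the point is only the shape of the calls:

```cpp
#include <windows.h>
#include <d3d9.h>     // Direct3D 9 (link against d3d9.lib)
#include <GL/gl.h>    // OpenGL     (link against opengl32.lib)

// Direct3D 9 flavour: everything goes through a device interface created earlier.
void ClearFrameD3D9(IDirect3DDevice9* device)
{
    device->Clear(0, NULL, D3DCLEAR_TARGET,
                  D3DCOLOR_XRGB(0, 0, 255),   // blue
                  1.0f, 0);
    device->Present(NULL, NULL, NULL, NULL);
}

// OpenGL flavour: a global state machine, no device object in sight.
void ClearFrameGL()
{
    glClearColor(0.0f, 0.0f, 1.0f, 1.0f);     // blue
    glClear(GL_COLOR_BUFFER_BIT);
    // The buffer swap is done by the platform layer, e.g. SwapBuffers() on Windows.
}
```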
Even if two applications are written in C++, that does not mean they're even remotely similar in their code. That also depends on code style choices, compiler and IDE tools, and much more. I studied C++ on Borland, and work with VS.
Quite a lot is different in the source code and the development environment, even if the underlying target is the same and the software is supposed to do the exact same thing.