Thread Rules
1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace Facebook with 3 dudes you found on the internet and $20).
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible.
4. Use [code] tags to format code blocks.
ZenithM
France15952 Posts
October 29 2014 01:14 GMT
#10761
On October 29 2014 10:11 Blisse wrote:
Do you think the future of coding will be 5K to 8K monitors? :3

Hmm. No? :D
Mindcrime
United States6899 Posts
October 29 2014 01:35 GMT
#10762
but pixels? as long as text is easily readable, who gives a shit.
Manit0u
Poland17046 Posts
October 29 2014 03:44 GMT
#10763
Besides that, I'm perfectly fine working in an 80x20 terminal window. Stuff you get used to over the years... That's also why most languages recommend lines no longer than 80 chars (75, really, to leave room for line numbers if you have that option turned on) and methods no longer than 20 lines, so you can view it all at once while working through ssh on a remote server.
ZenithM
France15952 Posts
October 29 2014 03:56 GMT
#10764
In fact, some coding conventions today advise 120 chars per line because, let's be honest, in some languages 80 chars is not much. It's obvious that having a lot of screen space is good (big monitors, dual/triple monitor setups...). Increasing pixel density, however, as it's done currently, is completely useless for coding. We need to be able to read the font, but we really don't need it to be super smooth and flawless and retina and whatnot. So I don't really believe you would have much use for an 8K monitor. It's only useful if you have a big screen. Coincidentally enough, you get the same effect by putting 3 monitors next to each other (3 monitors = 6K screen, so cool!!)
_fool
Netherlands663 Posts
October 29 2014 07:15 GMT
#10765
Question to the people that mentioned using a terminal and ssh to modify code on a remote machine: what type of software product (as in functionally: what does it do) are you working on? Having people modify code on remote servers sounds like there is no DTAP protocol, and maybe no version control either? Sounds brave but risky, and I'm curious what type of systems require this approach. (Don't get me wrong, I love my terminals as much as the next guy. But mostly for config changes, log analysis etcetera)
CatNzHat
United States1599 Posts
October 29 2014 07:48 GMT
#10766
On October 29 2014 16:15 _fool wrote:
Question to the people that mentioned using a terminal and ssh to modify code on a remote machine: what type of software product (as in functionally: what does it do) are you working on? Having people modify code on remote servers sounds like there is no DTAP protocol, and maybe no version control either? Sounds brave but risky, and I'm curious what type of systems require this approach. (Don't get me wrong, I love my terminals as much as the next guy. But mostly for config changes, log analysis etcetera)

I frequently dick around with code on remote staging instances with vi, but that's just to verify things are working as expected, because my local environment has some differences from production that the prod-like staging instances don't have. I can also see people working on various system/infrastructure tools and services in small to medium-sized companies. Things like tweaking some scripts on an asset server to optimize CDN caching TTLs might be done directly on the production box without version control, if there's only one engineer maintaining it and the code just needs to get the job done.
Manit0u
Poland17046 Posts
October 29 2014 08:00 GMT
#10767
meatpudding
Australia520 Posts
October 29 2014 09:45 GMT
#10768
On October 29 2014 16:15 _fool wrote:
Question to the people that mentioned using a terminal and ssh to modify code on a remote machine: what type of software product (as in functionally: what does it do) are you working on? Having people modify code on remote servers sounds like there is no DTAP protocol, and maybe no version control either? Sounds brave but risky, and I'm curious what type of systems require this approach. (Don't get me wrong, I love my terminals as much as the next guy. But mostly for config changes, log analysis etcetera)

Had to work on some sims where all the code is compiled and executed on a remote cluster. This was for a research project. The whole time I was in vi over ssh. I could have done it locally in a visual IDE and scp'd it over, but it was ultimately easier to do remotely. There was no source control used, although not much would have changed if there were.
nunez
Norway4003 Posts
October 29 2014 11:39 GMT
#10769
On October 29 2014 10:11 Blisse wrote:
Do you think the future of coding will be 5K to 8K monitors? :3

haha, no, i hope that as we hone our skills and tools, languages and code will become more expressive, closer to mathematics, further from prose, and we will need less workspaces. on the other hand i wouldn't mind another monitor or two!
Deleted User 101379
4849 Posts
October 29 2014 13:02 GMT
#10770
On October 29 2014 10:11 Blisse wrote:
Do you think the future of coding will be 5K to 8K monitors? :3

Programming probably benefits a lot more from additional monitors than from bigger or higher-resolution monitors. The physical separation really helps a lot, e.g. when making changes and watching them on the second monitor, or when having the documentation *cough*or stackoverflow*cough* on one screen and the code you are writing on the other. I'd say that any programmer working with just one monitor, whether it's 1k, 2k, 4k, 8k or 128k, is hurting his own productivity (and enjoyment).

On October 29 2014 16:15 _fool wrote:
Question to the people that mentioned using a terminal and ssh to modify code on a remote machine: what type of software product (as in functionally: what does it do) are you working on? Having people modify code on remote servers sounds like there is no DTAP protocol, and maybe no version control either? Sounds brave but risky, and I'm curious what type of systems require this approach. (Don't get me wrong, I love my terminals as much as the next guy. But mostly for config changes, log analysis etcetera)

Taking web development as a common example, it's often easier to debug a website "live" than to try to replicate the problem in the often quite different development environment. While ideally it shouldn't happen, software development environments are often far from ideal, with time and budget constraints, legacy code and such preventing developers from creating the development environment they'd like. It's not pretty, no one wants to do it, but in the end it's the reality of the job.

In the company I work for, I was promised that we'd use Git and automated deployment "soon" when I joined. That was about 15 months ago, and we still use CVS, upload files by hand, debug on live, etc. Our staging system stopped working about a year ago, so we deploy straight from development to live and everything gets tested on the live server. Every developer here wants to change it, but that would take 4 weeks of halted development and the product owners simply have other priorities.
ZenithM
France15952 Posts
October 29 2014 16:07 GMT
#10771
On October 29 2014 20:39 nunez wrote:
haha, no, i hope that as we hone our skills and tools, languages and code will become more expressive, closer to mathematics, further from prose, and we will need less workspaces. on the other hand i wouldn't mind another monitor or two!

It's likely the converse that will happen though ;D. Programming languages will become closer to prose and natural languages and further away from formal languages. Some could actually tend toward being even more formal, sure, but those won't be the ones that everybody will use.
LaNague
Germany9118 Posts
October 29 2014 16:14 GMT
#10772
you are free to write in assembler if you don't like prose. But most workplaces will use other languages that you can be more productive with.
bangsholt
Denmark138 Posts
October 29 2014 16:27 GMT
#10773
On October 30 2014 01:14 LaNague wrote:
you are free to write in assembler if you dont like prose. But most workplaces will use other languages where you can be more productive with.

Fucking scrub. Real men use a magnetized needle and a steady hand.[1]
nunez
Norway4003 Posts
October 29 2014 16:47 GMT
#10774
you write mathematics in assembly? hardcore bro. i use mathematical notation myself, but i'm a pleb.
Deleted User 101379
4849 Posts
October 29 2014 16:48 GMT
#10775
On October 30 2014 01:14 LaNague wrote:
you are free to write in assembler if you dont like prose. But most workplaces will use other languages where you can be more productive with.

The reason why languages won't become much closer to prose is that spoken language has a huge amount of ambiguity, which simply doesn't work in programming languages. A programming language has to be precise, so while you can change "{" to "begin" or indent several lines, it's still simply a different syntax for a very precise concept. If you just want words instead of symbols, there are C preprocessor macros that can do it for you.

I'd say that functional languages are the closest to spoken language, because they work directly with concepts instead of abstractions, e.g. "let x be y and do z with it", even though they have a lot more "syntax" than e.g. Python. However, even they are bound by the precision required to turn them into something the computer can work with. While computers can "understand" spoken language, i.e. prose, it's not usable for actual programming.
Blisse
Canada3710 Posts
October 29 2014 18:07 GMT
#10776
On October 29 2014 22:02 Morfildur wrote:
Programming probably benefits a lot more from additional monitors compared to bigger or higher resolution monitors. The physical separation really helps a lot, e.g. when making changes and watching them on the second monitor or when having the documentation *cough*or stackoverflow*cough* on one screen and the code you are writing on the other. I'd say that any programmer working with just one monitor, whether it's 1k, 2k, 4k, 8k or 128k is hurting his own productivity (and enjoyment).

I just don't understand why you guys think that using 4K means you'll see less. Whether it's the shitty 39" Seiki or the good 32" Dell, 4K is still 4x more space than one 1080p screen, so even with 3 1080p screens you're still only approaching one 4K screen. You can have more than one browser or code editor open at the same time on one 4K screen, so I don't understand the argument against it.
Manit0u
Poland17046 Posts
October 29 2014 19:35 GMT
#10777
On October 30 2014 03:07 Blisse wrote:
I just don't understand why you guys think that using 4K means you'll see less. Whether it's the shitty 39" Seiki or the good 32" Dell, 4K is still 4x more space than one 1080p screen, so even with 3 1080p screens you're still only approaching one 4K screen. You can have more than one browser or code editor open at the same time on one 4K screen, so I don't understand the argument against it.

Dude, you might have 4x the pixel count, but you only get 7'' of extra diagonal. Dual monitors (from your example) give you 25'' of extra space to fit stuff in. The text can get hard to read even at 1080p and usually requires upping the font sizes; I don't even want to think about 4K.
nunez
Norway4003 Posts
October 29 2014 19:42 GMT
#10778
Quixotic_tv
Germany130 Posts
October 29 2014 19:50 GMT
#10779
I am making a company-internal iOS app that establishes a connection to an Arduino via Wi-Fi. The shield I am using sends via UDP. Just want to know if some people here are doing something similar.
nunez
Norway4003 Posts
October 29 2014 23:46 GMT
#10780
lets the user define abstractions over arbitrary sets of move-constructible types, with no dependency from element type to abstraction type, thus avoiding the inherent absurdity (abstraction before element? that's silly!) of your regular inheritance. unless the abstraction is used recursively (gasp, that also sounds absurd), then you have to forward declare the element type, but there's no circular dependency since the element value is held by pointer.

also provides transparent copy and move semantics: when you assign one abstraction to another, the held element's assignment operator is used, i.e. you have to make sure the lhs and rhs are holding elements of the same (really ought to be assign-compatible) types. however, copy (both assignment and construction) and move assignment are not an element type requirement unless you use them in your code (the functions that call them are only instantiated when used).

also allows simultaneously deconstructing an arbitrary number (at the mercy of gcc) of these abstractions, in arbitrary contexts. this lets you induce polymorphism in arbitrary subsets of the direct product of the type sets belonging to the abstractions being deconstructed. phew...

to use a simpler specific case: when you do a unary deconstruction, you can induce different behaviour for each subset in an arbitrary exact cover of the type set of the abstraction you are deconstructing. the simplest examples of which are that all element types get mapped to different function overloads, and the opposite: all element types get mapped to functions instantiated from the same function template.

with this approach, abstractions and type-specific behaviour are completely separated from the type definition. instead of grouping functions typewise, this lets you group types functionwise. that's possibly neat!

example: [code block and build output from the original post not preserved]