|
Thread Rules
1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace Facebook with 3 dudes you found on the internet and $20).
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible.
4. Use [code] tags to format code blocks.
|
On January 29 2015 02:08 bangsholt wrote: And here all I do is just C, C# and the occasional Java when it's unavoidable. I seem to be missing out on so much.
On January 29 2015 02:18 Manit0u wrote: You're not. C++ is scrapcode. Nothing like glorious C or elegant C#. If you want truly elegant metaprogramming you need look no further than Lisp, which has been ahead of every other programming language since 1958.
On January 29 2015 03:56 darkness wrote: I disagree. C++ is really cool, but some people do their best to make their code unreadable. "Glorious C" - good luck with that unsafe language. :D
On January 29 2015 04:03 bangsholt wrote: I'm interpolating a bit here, but are you trying to say that C is unsafe and C++ is... safe?
On January 29 2015 04:04 darkness wrote: Well, I've recently started learning C++ and I can say it looks safer so far, provided that you use the right tools, of course. Edit: C's macros immediately come to mind. I'm also not saying C++ is completely safe; it obviously has a lot of undefined behaviour (e.g. dangling raw pointers), but I think it's safer than C.
Define "safe". The biggest plague of C++ is that it lets people who don't know any better do horrible things (mostly unknowingly), like overloading the comma operator. Not a big issue if you're working alone on some project, but if you have many contributors of vastly varying skill levels, the skilled guys end up painstakingly hunting down and fixing errors made by the less skilled people, and those are much harder to track through an entire class hierarchy than through a piece of C code. Also, STL and Boost, which provide you with nice features, aren't the most stable, portable or optimized (even simple stuff like string concatenation can easily get out of hand).
Also relevant:
*hidden complexity*
(1) It's hard to say what code will compile down to: e.g., constructors can be elided, but there is no guarantee. Profiling your compiler to find out whether a copy is elided is tedious "searching for secrets" that should be _explicit_.
(2) People don't understand static polymorphism and compile-time dispatch; they are used to objects sending messages dynamically (run-time dispatch).
(3) Coercion.
(4) Networks of objects are not explicitly laid out, hiding quadratically complex patterns of communication between objects.
(5) Data structure and data flow come before algorithms. Sometimes data structure dictates data flow (ad-hoc networks of objects); sometimes data flow dictates data structure (one of life's most disagreeable tasks, waiting in line, is characterized as FIFO). This, I feel, is the most important point, because the first rule of programming is to figure out what you want to say before you figure out how to say it. In C++, ad-hoc networks of objects with cyclic message paths are all too easy to create [see (4)], which means _code_ _is_ _not_ _explicit_ and as a result _code_ _is_ _not_ _fast_.
*transfer semantics on objects are not robust* This ties into (1) under hidden complexity: the code author needs to write a lot of boilerplate to achieve the desired transfer semantics on objects. Similarly, the code's audience, be it reviewer, maintainer or merger, needs to read a lot of boilerplate to understand how objects get moved around in memory. Moreover, most of these concepts are intuitively declarative in nature, such as a parent-object/child-object relation.
*poor re-use of effort* "Code re-use" is a misnomer; when programmers speak of code re-use they mean re-use of effort. There is no benefit to polymorphism if effort cannot be consolidated easily.
*C++ Standard iffy* Some things just disappear quickly for *frantic* reasons (strstream was deprecated for aesthetics), indicating not enough foresight into what is important. I do not want to pick a language where I have to worry about features in its "standard library" becoming deprecated mainly for aesthetics. As Dijkstra preached, programming is _not_ supposed to be a frantic exercise.
-- Z-Bo
[...] The problem with C++ is that every C++ developer has his own style, and reuse is an illusion within that style. Take a look at classes implementing matrix arithmetic: there are as many around as the day is long, and all of them are incompatible with one another.
With regard to programming styles, C++'s multiple inheritance doesn't save you either. For a single project grown from a single start, you can get reasonable solutions. But combining stuff creates maintenance messes.
With C, the situation is not dissimilar, but you spend less time fighting the illusion that you don't need to reimplement anyway.
The difference is that you can pass structures from one library into another with tolerable efficiency, because there are basically only two ways to lay out a two-dimensional array of floats. [...]
Care to explain why there are still not two numerical C++ libraries with compatible matrix classes?
What use is talking about portability and high level when a basic interoperability feature that has been available since the sixties (more than 4 decades ago) in Fortran has not yet managed to make it into C++? C++ by now more or less offers a (somewhat deficient) standardized way to work with complex numbers, but matrices are still not standardized in any manner, and libraries won't interoperate.
So C++ should get its head wrapped around the _low_ level problems first. It is a bloody shame that it still has not caught up with Fortran IV (or even Fortran II) with regard to usefulness for numerical libraries.
It is not a matter of "hating high level" to see that C++ is mostly focused about addressing the wrong kinds of problems in the wrong ways. The pain/gain ratio is just bad. [...]
Making a language huge and bloated in order to be able to use the language itself for defining a set of basic data types is just masturbation. C++ has the most complicated set of implicit conversions of any language in the world, and what for? It is modeled around being able to create a user-defined "complex" type which behaves almost as well as Fortran's. Too bad that this mostly means everybody will define his own type (well, at least we have seen two or three different library "standards" by now), and that the implicit conversion rules and chains are appallingly wrong for a number of other possible user-defined arithmetic types.
-- David Kastrup
|
Question for you guys coming from someone that does mainly webdev and .NET stuff.
I'm looking to create some mobile app (ideally and eventually for both the Android and iOS markets).
Is my understanding correct that for Android you use Java and for iOS either Swift or Objective-C?
If I have a choice between learning Swift and Objective-C, which would you say is the better path?
And any general recommendations for a newbie wanting to learn mobile app development would be pretty sweet!
|
Even with a Computer Science degree, I cannot appreciate Dijkstra's programming views, so I can only sympathise with ill souls who believe and quote him. That said, he is right about gotos, but that's it. Other than that, I'm not going into language religious wars. I'm happy with the programming languages I use at work, and you may continue to use whatever language you use. Just note that "varying levels" of programming skill apply to all languages, not just C++ or any particular one.
|
Swift, because Objective-C is going to be outdated. Unless you need to release in the next 3 months, there's no point in learning that mess.
|
@manitou your ignorance will always reveal itself when you attempt to give insights into a language you know little about.
try to keep it constructive, and focus on the languages you know and like.
|
On January 29 2015 05:50 BlueRoyaL wrote: Question for you guys coming from someone that does mainly webdev and .NET stuff.
I'm looking to create some mobile app (ideally and eventually for both the Android and iOS markets).
Is my understanding correct that for Android you use Java and for iOS either Swift or Objective-C?
If I have a choice between learning Swift and Objective-C, which would you say is the better path?
And any general recommendations for a newbie wanting to learn mobile app development would be pretty sweet!
Xamarin. It costs money but saves effort in the sense that you write the core once and port it to Windows Mobile/Android/iOS. If you look into Xamarin.Forms you can even make the interfaces look cool with the same pieces of code.
Best of all, Xamarin is C#.
|
On January 29 2015 06:05 darkness wrote: Even with a Computer Science degree, I cannot appreciate Dijkstra's programming views, so I can only sympathise with ill souls who believe and quote him. That said, he is right about gotos, but that's it. <snip generic stuff about languages>
You disagree with his programming views?
I mean, as an example, the goto paper is about using structured programming instead of using goto as control structures, to make code easier to write and to debug because you won't be jumping all over the place ^_^
Or do you dislike his algorithms? I really don't follow what you mean :D
|
On January 29 2015 06:38 bangsholt wrote:
On January 29 2015 06:05 darkness wrote: Even with a Computer Science degree, I cannot appreciate Dijkstra's programming views, so I can only sympathise with ill souls who believe and quote him. That said, he is right about gotos, but that's it. <snip generic stuff about languages>
You disagree with his programming views? I mean, as an example, the goto paper is about using structured programming instead of goto as a control structure, to make code easier to write and to debug because you won't be jumping all over the place ^_^ Or do you dislike his algorithms? I really don't follow what you mean :D
His algorithms aren't exactly programming related, so I didn't mean those. However, his hatred towards OOP is a prime example of why I don't agree with his programming views. I also said I agree with the criticism of gotos.
Edit:
Object-oriented programming is an exceptionally bad idea which could only have originated in California. -Edsger Dijkstra
|
On January 29 2015 06:26 nunez wrote: @manitou your ignorance will always reveal itself when you attempt to give insights into a language you know little about.
try to keep it constructive, and focus on the languages you know and like.
All in good faith. I was never much into language wars, so don't take my views on them too seriously. Also, I quoted some stuff I've found interesting (without the overused Torvalds C++ bashing) in hopes that perhaps someone can clarify it or post counter-arguments.
|
Object-oriented programming is an exceptionally bad idea which could only have originated in California. -Edsger Dijkstra
I agree with that statement.
|
On January 29 2015 06:15 Blisse wrote: Swift because Objective-C is going to be outdated. Unless you need to release in the next 3 months, no point in learning that mess.
That is simply not true. Objective-C will stay the main language of iOS development for a long, long time. Swift, however, is not even 'complete' yet; it has plenty of deficiencies and incomplete features.
|
if you think you are man enough to make a meaningful critique, go ahead and make it, instead of spamming vacuous nonsense.
however if you are having problems coming to terms with operator overloading, then i suggest you leave it be.
|
An example of a good non-programming book I personally recommend is Coders at Work. It's interviews with well-known programmers (Joshua Bloch, Knuth, Ken Thompson, etc.), with a lot of history and anecdotes. A very interesting read.
|
I can never tell if nunez is trying to write poetry or if he's just writing TL posts with emacs.
|
On January 29 2015 13:27 ZenithM wrote: I can never tell if nunez is trying to write poetry or if he's just writing TL posts with emacs. It all makes sense now!
|
On January 29 2015 06:44 darkness wrote: Object-oriented programming is an exceptionally bad idea which could only have originated in California. -Edsger Dijkstra
Well, according to Google he never even said that. There is this though:
(For those who have wondered: I don't think object-oriented programming is a structuring paradigm that meets my standards of elegance.)
|
Guys, can you help? Have you encountered such a thing before?
-- never mind, it's the easiest things that get you (as usual); I should have thought about not using the wrappers included in the project by previous programmers and gone for the basic thing instead --
|
Yesterday I implemented UDP in Ada in a stupid way (by using streams). What would be the best way to do this?
|
|