OK. Time to escalate this nonsense.
Cleanups to make the code easier to read and more maintainable.
These are remnants of the previous series.
I didn't say I wasn't going to do it, just that it was getting very tedious.
To be clear, the reason many of the later patches became obviously desirable is that previous patches removed or added code. If I reverse the order, it's not so clear they are desirable, and that's why they should be applied later.
This is precisely what happened with the table-array-to-single-int patch. You didn't see the value when it came before the binary-time algorithm, only when it was made afterwards. That's why it should have been made afterwards.
Even if you now see the value with the patch coming first, other readers might not, and readers' understanding is what should be maximized.
This should not be done after adding the new mode, but before.
Should? That's debatable.
So please move it to !93 (merged).
This is starting to feel like a job.
Also the commit title is strange: "consolidate algorithms". It doesn't consolidate anything, does it? It's just refactoring?
: to join together into one whole : UNITE
Seems to map precisely to what the patch is doing.
Couldn't it be simply integrated to "rework algorithms"?
Yes it could. Just like I could squash all 10 commits into a single one.
But it wouldn't be ideal, because now the reworking of the algorithms wouldn't be clear.
I prefer shorter names. There's no need to burden programmers with longer names if there will never be confusion. In fact, I don't see the need to prefix internal static functions, where `algo_time` is more than sufficient.

Plus, the code is so simple that a single `algo` function could do the job, or the code could just be copied into `xfce_clock_binary_draw`, which could easily be called `draw`.
In fact, I just gave that a try and the result is again much simpler.
My eyes!
All right, I don't particularly care much. I still think many users will be confused, but if they don't want the default they can simply try all the options until they find what they want.
The problem is how they will be presented to the user. The "main" mode is technically "binary-coded decimal"; we could use "decimal" for short, but that doesn't tell most users much. When a normal user thinks of a binary clock, they are probably thinking of a binary-coded decimal clock, which is the most typical kind. That's why I went for "main", although I considered "typical" and "normal" too.
We could internally call it `MODE_DECIMAL` even though it's presented as "Main" to the user, but that doesn't seem convenient: a user who looks at the code might get confused.
If you google for online binary clocks, most of them present the "main" mode right away, like timeity.com: no options, no nothing. Some have an option to enable the "true" binary clock, like sasaya.me, but they still present the "main" mode by default.
I think "binary-coded decimal" and "binary-coded sexagesimal" are more correct, but probably no user would understand.
I disagree.
This is a code change:

```diff
-  gint row, col;
-  static gint binary_table[] = { 32, 16, 8, 4, 2, 1 };
-  gint ticks;
+  gint row, col, ticks;
+  guint n, p = seconds ? 10000 : 100;
```
And this is a cleanup change (whitespace only; the declarations are realigned after `binary_table` is gone):

```diff
-  gint        row, col, ticks;
-  guint       n, p = seconds ? 10000 : 100;
+  gint  row, col, ticks;
+  guint n, p = seconds ? 10000 : 100;
```

And I know it's a cleanup change because if I do `git diff -w` I see nothing.
And this is doing both:

```diff
-  gint        row, col;
-  static gint binary_table[] = { 32, 16, 8, 4, 2, 1 };
-  gint        ticks;
+  gint  row, col, ticks;
+  guint n, p = seconds ? 10000 : 100;
```
This is one of the reasons I don't like this style: it requires constant unnecessary realignments. The Linux style (`gint row, col;`, no column alignment) is much better: no realignment necessary.
Anyway, if you want to see the two changes together I've cherry-picked them and rebased yet again.
"Python is a crappy language that is only popular because it is popular" is a bold statement.
A statement backed by evidence that I'd be happy to share with you. This is my belief, and you cannot tell me not to have that belief.
I could say the same about Ruby.
Yes, you could say that, and you would be objectively wrong.
The Python language is objectively and empirically bad, and I can prove it to you by showing dozens of examples. Pick any venue you want to debate this and I'll show it to you.
Or you could give the benefit of the doubt and just accept that perhaps there's a tiny possibility that my claim is true.
Don't be like that.
My opinion is my opinion. You cannot tell me not to have my opinion.