mirror of https://github.com/sockspls/badfish synced 2025-04-30 08:43:09 +00:00
Commit graph

1367 commits

Author SHA1 Message Date
Marco Costalba
43276cbec5 Fix a bug in insert_pv() where minimum depth is zero
We implicitly considered the minimum depth stored in the TT
to be Depth(0), but since we also store values in the TT from
qsearch(), where depth is < 0, we need to use a negative
number as the minimum depth.

Bug spotted by Joona Kiiski.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-23 15:30:20 +01:00
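
One possible reading of the fix above, as a minimal sketch with hypothetical names (PvMinimumDepth, tt_store()) that are not from the patch: since qsearch() writes TT entries with depth < 0, any "minimum depth" sentinel used when re-inserting PV positions must lie below every depth that can actually appear in the table.

    // Sketch only: illustrative names, not the real insert_pv() code.
    typedef int Depth;

    // Was Depth(0): qsearch() entries with depth < 0 then compared below
    // the sentinel, which is exactly the bug described above.
    const Depth PvMinimumDepth = Depth(-127);

    void insert_pv_entry(Depth searchedDepth /*, key, value, move ... */) {
        // Clamp to the sentinel so a PV entry is never stored with a depth
        // below the agreed minimum for TT entries.
        Depth d = searchedDepth > PvMinimumDepth ? searchedDepth : PvMinimumDepth;
        // tt_store(key, value, move, d);   // hypothetical TT call
        (void)d;
    }
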
Marco Costalba
a9e55d4326 Revert odd-depth razoring
I have just made a new rule: no modification
that increases pruning is allowed if, after 1000 games,
ELO is not increased by at least 10 points (it was +5 in this case).

Yes, I like this kind of nonsense rule :-)
2009-03-23 12:02:15 +01:00
Marco Costalba
4d70e3aeac More aggressive dynamic LMR
The previous setup didn't change anything:

After 996 games 1+0: +267 -261 =468 +2 ELO

Now with this new setup we have

After 999 games 1+0: +277 -245 =477 +11 ELO

Seems reasonable...

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-23 11:58:28 +01:00
Marco Costalba
24b7ad54c7 LMR dynamic reduction
Reduce by two plies near the leaves, and only when we still
have enough depth to go, so as to limit horizon effects.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-22 23:53:22 +01:00
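
One possible reading of the commit above, sketched with made-up thresholds; the actual conditions in the patch are not reproduced here.

    // Sketch only: illustrative names and constants.
    typedef int Depth;

    Depth lmr_reduction(Depth depth, int moveCount) {
        // Classic LMR: late moves get a one-ply reduction.
        Depth reduction = 1;

        // Dynamic part: reduce by two plies for very late moves, but only
        // while enough depth remains, so the reduced search does not drop
        // straight into qsearch and suffer from horizon effects.
        if (moveCount > 12 && depth > 4)      // illustrative thresholds
            reduction = 2;

        return reduction;
    }
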
Marco Costalba
0ff3bf34cd Always print a best move when requested
Little fix merged from iPhone Glaurung.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-22 23:53:10 +01:00
Marco Costalba
66d165921d Better castle move detector in move_to_san()
Merged from iPhone Glaurung.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-22 23:52:59 +01:00
Marco Costalba
b82c3021fa Fix a small bug in Position::from_fen
We could fail to parse an en passant square
in some cases.

Merged from iPhone Glaurung.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-22 23:52:23 +01:00
Marco Costalba
320630ca1a Merge new pawn storm evaluation
More accuracy in pawn storm evaluation
directly from iPhone Glaurung.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-22 23:52:06 +01:00
Marco Costalba
ef8acdc73b Fix a small bug in king safety
Merged from iPhone Glaurung.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-22 13:11:24 +01:00
Marco Costalba
cc8e915ed5 Merge KBPP vs KB endgame from iPhone Glaurung
Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-22 13:06:29 +01:00
Marco Costalba
16abc165d8 Fix: In qsearch do not use TT value when in a PV node
We already do this in search_pv(), so extend
the same behaviour to qsearch().

Bug spotted by Joona Kiiski.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-21 14:51:31 +01:00
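
A rough sketch of the rule described above, with a hypothetical TTEntry type; the real probing code also checks the bound type and stored depth.

    // Sketch only: illustrative types and omitted conditions.
    struct TTEntry { int value; int depth; /* bound type, move, ... */ };

    bool can_use_tt_value(const TTEntry* tte, bool pvNode) {
        // Never trust the stored value in a PV node: a bound coming from
        // the TT could truncate the principal variation and distort the
        // returned score. search_pv() already follows this rule; this
        // commit extends it to qsearch().
        return tte && !pvNode /* && depth/bound checks */;
    }
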
Marco Costalba
74160ac602 Big headers cleanup
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-19 12:55:32 +01:00
Marco Costalba
feb5342b39 Safeguard some wild and ugly casts
These casts are needed but plain ugly; at least make
sure they don't hide any subtle conversion bug.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-16 13:59:41 +01:00
Marco Costalba
6cddf9183c Partially revert pawn storm bug fix
Try to save space and use the minimum size
possible.

In particular, restore int16_t for values and int8_t
for halfOpenFiles.

Use int16_t for storm values instead of int, and also
instead of the original int8_t, which was buggy and too small.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-16 08:02:33 +01:00
Marco Costalba
b870f5a091 Silence a good bunch of Intel warnings
Note that some pawn and material info has been switched
from int8_t to int.

This wastes space, but it is not clear whether the code is
faster or slower (or unchanged); some testing is needed.

A few warnings are still alive.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-15 18:19:08 +01:00
Marco Costalba
fcecc5212e Fix an overflow bug in pawn stormValue
These fields are defined as int8_t, but values bigger
than 127 are stored there, so we silently overflow.

Fix by promoting all the fields to a sane int type. This
will increase memory usage, but apart from being safe, it is
not clear whether the code is slower or faster. Testing is needed.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-15 18:18:56 +01:00
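
A self-contained illustration of the class of bug described above, with made-up values; the real fields live in the pawn hash entry.

    // Illustrative only: shows how an int8_t accumulator silently wraps.
    #include <cstdint>
    #include <cstdio>

    int main() {
        int8_t stormValue = 100;   // the fields were declared int8_t
        stormValue += 60;          // accumulated bonuses can exceed 127...
        std::printf("%d\n", (int)stormValue);   // ...and wrap, e.g. -96

        int safeValue = 100;       // promoting the field to int fixes it
        safeValue += 60;
        std::printf("%d\n", safeValue);         // 160 as expected
        return 0;
    }
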
Marco Costalba
8de91be61e Fix a silly warning on Intel compiler
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-15 18:18:42 +01:00
Marco Costalba
ca4e78db8d Revert NULL move beta corrections
After testing, the result is bad: -25 ELO
2009-03-15 16:44:12 +01:00
Marco Costalba
bfcfaf7101 Retire Null Driven IID
It does not seem to clearly improve things and
is disabled by default anyway, so retire it for now.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-15 16:43:28 +01:00
Marco Costalba
c8773c720a Merge Joona Kiiski's null search beta correction
Prune more moves after a null search because of
a lower beta limit than in the main search.

In test positions this reduces the searched nodes by 30%!!!!

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-14 13:00:22 +01:00
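
One possible reading of the idea above, as a minimal sketch; the name and margin are assumptions, not taken from the patch.

    // Sketch only: illustrative name and margin.
    typedef int Value;

    bool null_search_cutoff(Value nullValue, Value beta) {
        const Value betaCorrection = 50;   // illustrative margin
        // Prune the remaining moves when the null search already reaches a
        // limit slightly below beta, i.e. a lower beta limit than the one
        // used in the main search, so more positions are pruned.
        return nullValue >= beta - betaCorrection;
    }
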
Marco Costalba
3ed603cd64 Merge Joona Kiiski's evaluation tweaks
Merge tweaks to many evaluation parameters
by Joona Kiiski.

After testing, they seem good!

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-14 12:55:14 +01:00
Marco Costalba
f637ddc1e8 Micro optimize move_is_check()
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-07 21:05:49 +01:00
Marco Costalba
964bd86272 Micro optimize pl_move_is_legal()
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-07 21:05:31 +01:00
Marco Costalba
72b88e09e1 Micro optimize previous patch
Also remove some Intel warnings, though not all :-(

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-06 20:47:19 +01:00
Marco Costalba
3e663d8c50 Introduce evaluate_pieces<>() to remove redundancy
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-06 19:19:45 +01:00
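
A sketch of the kind of refactoring named above: a function template parameterized on the piece type lets one loop body serve all piece evaluations. The enum and parameters shown here are assumptions.

    // Sketch only: illustrative types; the real function also takes the
    // position, the side to move, attack info, and so on.
    enum PieceType { KNIGHT, BISHOP, ROOK, QUEEN };

    template<PieceType Piece>
    int evaluate_pieces(/* const Position& pos, Color us, ... */) {
        int bonus = 0;
        // One shared loop over the pieces of type Piece: mobility, outposts,
        // etc. The per-piece differences are resolved at compile time via
        // the template parameter, so the former per-piece copies of this
        // loop collapse into instantiations of a single function.
        return bonus;
    }
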
Marco Costalba
8c9c51c721 Fix compile error with inlines under gcc and Intel
It seems that these compilers do not like inline functions
that call a template when the template definition is not in scope.

So move these functions from the header into the *.cpp file.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-05 00:38:45 +01:00
Marco Costalba
7fe1632a49 Fix some comments in position.cpp
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-04 22:51:20 +01:00
Marco Costalba
7c84b39a42 Avoid calling useless slider attacks in update_checkers()
Quickly filter out some calls to slider attacks
when we already know they will fail for sure.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-04 22:51:04 +01:00
Marco Costalba
68e711aac6 Super fast hidden_checkers()
Rewrite hidden_checkers() to avoid calling the
slider attack functions, using just a much
faster squares_between() instead.

Also a good code simplification.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-04 22:50:51 +01:00
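
A sketch of the technique described above, assuming typical bitboard helpers; the exact signatures in the patch differ. The key point is that, for a slider aligned with the king, a hidden checker or pin is detected purely from the occupancy of the squares between them, with no sliding-attack generation.

    // Sketch only: illustrative helper, not the patched hidden_checkers().
    typedef unsigned long long Bitboard;

    bool hides_checker(Bitboard between,    // squares_between(kingSq, sliderSq)
                       Bitboard occupied,   // all pieces
                       Bitboard candidates) // pieces that may be pinned/dc
    {
        Bitboard blockers = between & occupied;
        // Exactly one blocker on the line, and it is one of the candidates.
        return blockers && !(blockers & (blockers - 1)) && (blockers & candidates);
    }
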
Marco Costalba
02cd96e4c2 Cleanup SearchStack initialization
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-03 21:01:00 +01:00
Marco Costalba
772a37cd54 Micro optimize copy of new state in do_move()
Instead of copying everything, copy only the fields that
are updated incrementally, not the ones that are
recalculated from scratch anyway.

This reduces the copy overhead by 30%.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-02 18:00:42 +01:00
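
A sketch of the partial-copy trick described above; the field names, layout and sizes are illustrative, not the real state struct.

    // Sketch only: copy the incrementally-updated prefix, skip the rest.
    #include <cstddef>
    #include <cstring>

    struct State {
        // Updated incrementally in do_move(): must be copied first.
        int castleRights, rule50, epSquare;
        // Recomputed from scratch after every move: no need to copy.
        unsigned long long key, checkersBB;
    };

    void partial_copy(State& newSt, const State& oldSt) {
        // Copy only up to the first recomputed field.
        std::memcpy(&newSt, &oldSt, offsetof(State, key));
    }
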
Marco Costalba
c02613860a Revert hidden checkers rework
It is slower than the previous, uglier but faster, code.

So completely restore the old one for now :-(

Just keep the rework of state backup/restore in do_move().

We will cherry-pick bits of the previous work once we are sure
we have fixed the performance regression.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-03-02 16:20:00 +01:00
Marco Costalba
9b6b9e67fe Use checker info to remove a bunch of hidden checks updates
Another powerful condition lets us remove a big chunk of
useless updates.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-28 18:43:36 +01:00
Marco Costalba
6a8cfe79da Stricter condition to check for dc candidates
Another optimization that lets us remove another half
of the find_hidden_checks(them, DcCandidates) calls.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-28 18:43:20 +01:00
Marco Costalba
1f97b48a31 Split calculation of pinners from dc candidates
This lets us calculate only pinners when we know that
dc candidates are not possible.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-28 18:43:02 +01:00
Marco Costalba
a96cba0ec8 Slightly better condition in update_hidden_checks()
Use a stricter condition to check whether we have captured
an opponent's pinner or hidden checker.

With this patch the occurrences of checkerCaptured == true are
reduced by 50%.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-28 18:42:51 +01:00
Marco Costalba
f9f30412e7 Compute pinned and friends incrementally
In do_move() use the previous pinned bitboard values to compute
the new ones after the move. In particular, we end up with the
same bitboards in most cases, so detect these cases and just
keep the old values.

This should greatly speed up this slow computation in a very hot
path, so that we can use this important info everywhere in the
code at very cheap cost.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-28 18:42:30 +01:00
Marco Costalba
55f9afee2a Fix a subtle bug due to the StateInfo pointer becoming stale
There was one occurrence where the StateInfo variable went
out of scope before the corresponding Position object.

This leads to a crash. The bug was not hit before because it occurs
only when using a UCI interface and not the usual benchmark.

The fix consists in internally copying the content of the
about-to-go-stale StateInfo.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-23 21:45:12 +01:00
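
A sketch of the failure mode and the fix described above; member names are illustrative. Because the Position keeps only a pointer to a caller-provided StateInfo, that object must outlive every use of the pointer, which a local variable in a UCI command handler does not guarantee.

    // Sketch only: illustrative members, not the real classes.
    struct StateInfo { int data; StateInfo* previous; };

    class Position {
        StateInfo  startState;   // internally owned copy (the fix)
        StateInfo* st;           // normally points at caller-owned state
    public:
        void detach_state() {
            // Called when the external StateInfo is about to go out of
            // scope: copy its content into the Position so st never dangles.
            startState = *st;
            st = &startState;
        }
    };
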
Marco Costalba
962216440c Teach SEE about pinned pieces
Remove pinned pieces from the attack set when calculating
the SEE value.

The algorithm is not perfect: there should be no false
positives, but it can happen that we fail to remove a
pinned piece. Currently we don't catch 100% of cases,
but it is a tradeoff between speed and accuracy. In any
case we stay on the safe side, so we only remove an attacker
when we are sure it is pinned.

Only about 0.5% of cases are affected by this patch, not
a lot given the hard work: this is a difficult patch!

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-23 21:45:01 +01:00
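
A sketch of the conservative rule described above; the bitboard names are assumptions. An attacker is dropped from the SEE swap list only when it is known to be pinned and its pin is still live, so a missed pin costs accuracy but never produces a false positive.

    // Sketch only: illustrative helper, not the patched SEE code.
    typedef unsigned long long Bitboard;

    Bitboard prune_pinned_attackers(Bitboard attackers, Bitboard pinned,
                                    Bitboard pinners, Bitboard occupied)
    {
        // Remove attackers only when we are sure they are pinned: the
        // pinner must still be on the board for the pin to hold.
        if (pinners & occupied)
            return attackers & ~pinned;
        return attackers;   // when in doubt, keep the attacker
    }
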
Marco Costalba
243fa483d7 Small Position::clear() cleanup
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-23 21:44:46 +01:00
Marco Costalba
da7a62852a Do not copy the whole old state in do_move()
Instead of copying the old state into the new one, copy only
the fields that will be updated incrementally, not the ones
that will be recalculated anyway.

This lets us copy 13 bytes instead of 28 for each do_move()
call.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-23 21:44:29 +01:00
Marco Costalba
0bf45823da Update pinned bitboards and friends in do_move()
It is probably slightly slower, but the code is surely better
this way. We will optimize for speed later.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-23 21:44:17 +01:00
Marco Costalba
4324276419 Fix some asserts exposed by a debug compile
Fallout from previous patches.
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-23 21:44:04 +01:00
Marco Costalba
43bc5479c2 Avoid resetting pinners[c]
Small optimization. No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-22 21:19:04 +01:00
Marco Costalba
8f59de48f5 Introduce StateInfo instead of UndoInfo
We no longer back up the state; instead we use the renamed StateInfo
argument passed to do_move() to store the new position
state when making a move.

Undoing is now just a revert to the previous StateInfo, which we know
because we store a pointer to it.
Note that the backing store is now up to the caller; Position is
stateless in that regard, and state is accessed through a pointer.

This patch will let us remove all the backup/restore copying;
just a pointer switch is now necessary.

Note that do_null_move() still uses StateInfo as a backup.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-22 21:18:50 +01:00
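
A minimal sketch of the scheme described above; the members shown are illustrative. Each do_move() receives a caller-owned StateInfo for the new state and links it to the current one, so undoing a move is a pointer switch rather than a restore copy.

    // Sketch only: illustrative members and omitted parameters.
    struct StateInfo { /* position state fields */ StateInfo* previous; };

    class Position {
        StateInfo* st;   // current state, owned by the caller
    public:
        void do_move(/* Move m, */ StateInfo& newSt) {
            newSt.previous = st;   // remember where to roll back to
            st = &newSt;           // the new state becomes current
            // ... update *st incrementally for the move ...
        }
        void undo_move(/* Move m */) {
            st = st->previous;     // no restore copy needed
            // ... revert the board arrays not kept in StateInfo ...
        }
    };
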
Marco Costalba
2f6c5f00e6 Wrap state variables in a named struct
This will allow us to more easily move the state
out of the Position class.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-22 21:18:35 +01:00
Marco Costalba
2f21ec39ad Also convert undo_null_move() to avoid passing an UndoInfo object
No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-22 21:18:14 +01:00
Marco Costalba
9b257ba29d Passing UndoInfo is no longer needed when undoing a move
We now store a pointer to the previous UndoInfo in the struct
itself, as the 'previous' field, so when doing a move we also know
where to get the previous info when undoing the move.

This is needed for future patches and is a nice cleanup anyway.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-22 21:18:02 +01:00
Marco Costalba
1b0fee9b17 Remove two useless calls to pinned_pieces()
They are obsoleted by the new pinned-pieces caching code.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-22 21:17:44 +01:00
Marco Costalba
b7cb6180cf Position: Unify and templatize mg_pst() and eg_pst()
Also templatize compute_value(), which can be simpler now that
the above is templatized too.

No functional change.

Signed-off-by: Marco Costalba <mcostalba@gmail.com>
2009-02-20 22:50:35 +01:00
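
A sketch of what such a unification can look like; the table names and signatures are assumptions. A single template parameterized on the game phase replaces the mg_pst()/eg_pst() pair, and compute_value() is then written once.

    // Sketch only: illustrative storage and signatures.
    enum GamePhase { MidGame, EndGame };
    enum { PIECE_NB = 16, SQUARE_NB = 64 };

    extern int PstTable[2][PIECE_NB][SQUARE_NB];   // hypothetical tables

    template<GamePhase Phase>
    int pst(int piece, int square) {
        return PstTable[Phase][piece][square];
    }

    template<GamePhase Phase>
    int compute_value(const int pieces[], const int squares[], int count) {
        int sum = 0;
        for (int i = 0; i < count; ++i)
            sum += pst<Phase>(pieces[i], squares[i]);
        return sum;
    }
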