Discussion: freeradius crashing
8zero2 operations
2018-11-06 08:04:45 UTC
Hi,

Here is a trace of the crash:

Starting program: /usr/local/sbin/radiusd -fxx
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
[New Thread 0x7fffebbf8700 (LWP 116866)]
[New Thread 0x7fffeb3f7700 (LWP 116867)]
[New Thread 0x7fffeabf6700 (LWP 116868)]
[New Thread 0x7fffea3f5700 (LWP 116869)]
[New Thread 0x7fffe9bf4700 (LWP 116870)]
[New Thread 0x7fffe93f3700 (LWP 116871)]
[New Thread 0x7fffe8bf2700 (LWP 116872)]
[New Thread 0x7fffe83f1700 (LWP 116873)]
[New Thread 0x7fffe7bf0700 (LWP 116874)]
[New Thread 0x7fffe73ef700 (LWP 116875)]
[New Thread 0x7fffe6bee700 (LWP 116876)]
[New Thread 0x7fffe63ed700 (LWP 116877)]
[New Thread 0x7fffe5bec700 (LWP 116878)]
[New Thread 0x7fffe53eb700 (LWP 116879)]
[New Thread 0x7fffe4bea700 (LWP 116880)]
[New Thread 0x7fffe43e9700 (LWP 116881)]
[New Thread 0x7fffe3be8700 (LWP 116882)]
[New Thread 0x7fffe33e7700 (LWP 116883)]
[New Thread 0x7fffe2be6700 (LWP 116884)]
[New Thread 0x7fffe23e5700 (LWP 116885)]
[New Thread 0x7fffe1be4700 (LWP 116886)]
[New Thread 0x7fffe13e3700 (LWP 116887)]
[New Thread 0x7fffe0be2700 (LWP 116888)]
[New Thread 0x7fffe03e1700 (LWP 116889)]
[New Thread 0x7fffdfbe0700 (LWP 116890)]
[New Thread 0x7fffdf3df700 (LWP 116891)]
[New Thread 0x7fffdebde700 (LWP 116892)]
[New Thread 0x7fffde3dd700 (LWP 116893)]
[New Thread 0x7fffddbdc700 (LWP 116894)]
[New Thread 0x7fffdd3db700 (LWP 116895)]
[New Thread 0x7fffdcbda700 (LWP 116896)]
[New Thread 0x7fffdc3d9700 (LWP 116897)]
[New Thread 0x7fffdbbd8700 (LWP 116898)]
[New Thread 0x7fffdb3d7700 (LWP 116899)]
[New Thread 0x7fffdabd6700 (LWP 116900)]
[New Thread 0x7fffda3d5700 (LWP 116901)]
[New Thread 0x7fffd9bd4700 (LWP 116902)]
[New Thread 0x7fffd93d3700 (LWP 116903)]
[New Thread 0x7fffd8bd2700 (LWP 116904)]
[New Thread 0x7fffd83d1700 (LWP 116905)]
[New Thread 0x7fffd7bd0700 (LWP 116906)]
[New Thread 0x7fffd73cf700 (LWP 116907)]
[New Thread 0x7fffd6bce700 (LWP 116908)]
[New Thread 0x7fffd63cd700 (LWP 116909)]
[New Thread 0x7fffd5bcc700 (LWP 116910)]
[New Thread 0x7fffd53cb700 (LWP 116911)]
[New Thread 0x7fffd4bca700 (LWP 116912)]
[New Thread 0x7fffd43c9700 (LWP 116913)]
[New Thread 0x7fffd3bc8700 (LWP 116914)]
[New Thread 0x7fffd33c7700 (LWP 116915)]
[New Thread 0x7fffd2bc6700 (LWP 118007)]
[New Thread 0x7fffd23c5700 (LWP 118008)]
[New Thread 0x7fffd1bc4700 (LWP 118009)]
[New Thread 0x7fffd13c3700 (LWP 118010)]

Thread 34 "radiusd" received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x7fffdbbd8700 (LWP 116898)]
0x0000000000442c62 in insert_into_proxy_hash (request=0x7fff500039d0)
    at src/main/process.c:2303
2303            request->home_server->currently_outstanding++;

  Id   Target Id                                   Frame
  32   Thread 0x7fffdcbda700 (LWP 116896) "radiusd" 0x00007ffff6c74827 in
       futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0,
       futex_word=0x6873e8 <thread_pool+168>)
       at ../sysdeps/unix/sysv/linux/futex-internal.h:205
  33   Thread 0x7fffdc3d9700 (LWP 116897) "radiusd" 0x00007ffff6c74827 in
       futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0,
       futex_word=0x6873e8 <thread_pool+168>)
       at ../sysdeps/unix/sysv/linux/futex-internal.h:205
* 34   Thread 0x7fffdbbd8700 (LWP 116898) "radiusd" 0x0000000000442c62 in
       insert_into_proxy_hash (request=0x7fff500039d0) at src/main/process.c:2303
  35   Thread 0x7fffdb3d7700 (LWP 116899) "radiusd" 0x00007ffff6c74827 in
       futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0,
       futex_word=0x6873e8 <thread_pool+168>)
       at ../sysdeps/unix/sysv/linux/futex-internal.h:205
  36   Thread 0x7fffdabd6700 (LWP 116900) "radiusd" 0x00007ffff6c74827 in
       futex_abstimed_wait_cancelable (private=0, abstime=0x0, expected=0,
       futex_word=0x6873e8 <thread_pool+168>)
       at ../sysdeps/unix/sysv/linux/futex-internal.h:205




Thread 38 (Thread 0x7fffd9bd4700 (LWP 116902)):
#0  0x00007ffff6c74827 in futex_abstimed_wait_cancelable (private=0,
    abstime=0x0, expected=0, futex_word=0x6873e8 <thread_pool+168>)
    at ../sysdeps/unix/sysv/linux/futex-internal.h:205
        __ret = -512
        oldtype = 0
        err = <optimized out>
#1  do_futex_wait (sem=sem@entry=0x6873e8 <thread_pool+168>, abstime=0x0)
    at sem_waitcommon.c:111
No locals.
#2  0x00007ffff6c748d4 in __new_sem_wait_slow (sem=0x6873e8 <thread_pool+168>,
    abstime=0x0) at sem_waitcommon.c:181
        _buffer = {__routine = 0x7ffff6c747e0 <__sem_wait_cleanup>,
          __arg = 0x6873e8 <thread_pool+168>, __canceltype = 0, __prev = 0x0}
        err = <optimized out>
        d = 227633266688
#3  0x00007ffff6c7497a in __new_sem_wait (sem=<optimized out>) at sem_wait.c:29
No locals.
#4  0x000000000043c8ae in request_handler_thread (arg=0xb68680)
    at src/main/threads.c:755
        self = 0xb68680
#5  0x00007ffff6c6c6ba in start_thread (arg=0x7fffd9bd4700)
    at pthread_create.c:333
        __res = <optimized out>
        pd = 0x7fffd9bd4700
        now = <optimized out>
        unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140736846448384,
              -8854724666266409425, 0, 140737488346975, 140736846449088, 0,
              8854641668176815663, 8854744653752641071},
            mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0},
            data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
        not_first_call = <optimized out>
        pagesize_m1 = <optimized out>
        sp = <optimized out>
        freesize = <optimized out>
        __PRETTY_FUNCTION__ = "start_thread"
#6  0x00007ffff676a41d in clone ()
    at ../sysdeps/unix/sysv/linux/x86_64/clone.S:109
No locals.

Thread 37 (Thread 0x7fffda3d5700 (LWP 116901)):
#0  0x00007ffff6c74827 in futex_abstimed_wait_cancelable (private=0,
    abstime=0x0, expected=0, futex_word=0x6873e8 <thread_pool+168>)
    at ../sysdeps/unix/sysv/linux/futex-internal.h:205
        __ret = -512
        oldtype = 0
        err = <optimized out>
#1  do_futex_wait (sem=sem@entry=0x6873e8 <thread_pool+168>, abstime=0x0)
    at sem_waitcommon.c:111
No locals.
#2  0x00007ffff6c748d4 in __new_sem_wait_slow (sem=0x6873e8 <thread_pool+168>,
    abstime=0x0) at sem_waitcommon.c:181
        _buffer = {__routine = 0x7ffff6c747e0 <__sem_wait_cleanup>,
          __arg = 0x6873e8 <thread_pool+168>, __canceltype = 0, __prev = 0x0}
        err = <optimized out>
        d = 214748364800
#3  0x00007ffff6c7497a in __new_sem_wait (sem=<optimized out>) at sem_wait.c:29
No locals.
#4  0x000000000043c8ae in request_handler_thread (arg=0xb67ea0)
    at src/main/threads.c:755
        self = 0xb67ea0
#5  0x00007ffff6c6c6ba in start_thread (arg=0x7fffda3d5700)
    at pthread_create.c:333
        __res = <optimized out>
        pd = 0x7fffda3d5700
        now = <optimized out>
        unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140736854841088,
              -8854724666266409425, 0, 140737488346975, 140736854841792, 0,
              8854647161976858159, 8854744653752641071},
            mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0},
            data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
        not_first_call = <optimized out>
        pagesize_m1 = <optimized out>
        sp = <optimized out>
        freesize = <optimized out>
        __PRETTY_FUNCTION__ = "start_thread"
#6  0x00007ffff676a41d in clone ()
    at ../sysdeps/unix/sysv/linux/x86_64/clone.S:109
No locals.


Thread 36 (Thread 0x7fffdabd6700 (LWP 116900)):
#0  0x00007ffff6c74827 in futex_abstimed_wait_cancelable (private=0,
    abstime=0x0, expected=0, futex_word=0x6873e8 <thread_pool+168>)
    at ../sysdeps/unix/sysv/linux/futex-internal.h:205
        __ret = -512
        oldtype = 0
        err = <optimized out>
#1  do_futex_wait (sem=sem@entry=0x6873e8 <thread_pool+168>, abstime=0x0)
    at sem_waitcommon.c:111
No locals.
#2  0x00007ffff6c748d4 in __new_sem_wait_slow (sem=0x6873e8 <thread_pool+168>,
    abstime=0x0) at sem_waitcommon.c:181
        _buffer = {__routine = 0x7ffff6c747e0 <__sem_wait_cleanup>,
          __arg = 0x6873e8 <thread_pool+168>, __canceltype = 0, __prev = 0x0}
        err = <optimized out>
        d = 227633266688
#3  0x00007ffff6c7497a in __new_sem_wait (sem=<optimized out>) at sem_wait.c:29
No locals.
#4  0x000000000043c8ae in request_handler_thread (arg=0xb67770)
    at src/main/threads.c:755
        self = 0xb67770
#5  0x00007ffff6c6c6ba in start_thread (arg=0x7fffdabd6700)
    at pthread_create.c:333
        __res = <optimized out>
        pd = 0x7fffdabd6700
        now = <optimized out>
        unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140736863233792,
              -8854724666266409425, 0, 140737488346975, 140736863234496, 0,
              8854648262025356847, 8854744653752641071},
            mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0},
            data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
        not_first_call = <optimized out>
        pagesize_m1 = <optimized out>
        sp = <optimized out>
        freesize = <optimized out>
        __PRETTY_FUNCTION__ = "start_thread"
#6  0x00007ffff676a41d in clone ()
    at ../sysdeps/unix/sysv/linux/x86_64/clone.S:109
No locals.

Thread 35 (Thread 0x7fffdb3d7700 (LWP 116899)):
#0  0x00007ffff6c74827 in futex_abstimed_wait_cancelable (private=0,
    abstime=0x0, expected=0, futex_word=0x6873e8 <thread_pool+168>)
    at ../sysdeps/unix/sysv/linux/futex-internal.h:205
        __ret = -512
        oldtype = 0
        err = <optimized out>
#1  do_futex_wait (sem=sem@entry=0x6873e8 <thread_pool+168>, abstime=0x0)
    at sem_waitcommon.c:111
No locals.
#2  0x00007ffff6c748d4 in __new_sem_wait_slow (sem=0x6873e8 <thread_pool+168>,
    abstime=0x0) at sem_waitcommon.c:181
        _buffer = {__routine = 0x7ffff6c747e0 <__sem_wait_cleanup>,
          __arg = 0x6873e8 <thread_pool+168>, __canceltype = 0, __prev = 0x0}
        err = <optimized out>
        d = 219043332096
#3  0x00007ffff6c7497a in __new_sem_wait (sem=<optimized out>) at sem_wait.c:29
No locals.
#4  0x000000000043c8ae in request_handler_thread (arg=0xb671b0)
    at src/main/threads.c:755
        self = 0xb671b0
#5  0x00007ffff6c6c6ba in start_thread (arg=0x7fffdb3d7700)
    at pthread_create.c:333
        __res = <optimized out>
        pd = 0x7fffdb3d7700
        now = <optimized out>
        unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140736871626496,
              -8854724666266409425, 0, 140737488346975, 140736871627200, 0,
              8854644964027344431, 8854744653752641071},
            mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0},
            data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
        not_first_call = <optimized out>
        pagesize_m1 = <optimized out>
        sp = <optimized out>
        freesize = <optimized out>
        __PRETTY_FUNCTION__ = "start_thread"
#6  0x00007ffff676a41d in clone ()
    at ../sysdeps/unix/sysv/linux/x86_64/clone.S:109
No locals.


Thread 34 (Thread 0x7fffdbbd8700 (LWP 116898)):
#0  0x0000000000442c62 in insert_into_proxy_hash (request=0x7fff500039d0)
    at src/main/process.c:2303
buf = '\000' <repeats 48 times>,
"x|\275\333\230\000\000\000\000\234\240F\367&\270\037\220|\275\333\377\177\000\000\212\214B\000\000\000\000\000\320\071\000P\377\177\000\000\000\000\000\000\005\000\000\000\000\000\000\000\a",
'\000' <repeats 11 times>, "`^\256", '\000' <repeats 12 times>

        tries = 0
        success = true
        proxy_listener = 0xb8a510
#1  0x0000000000448396 in request_coa_originate (request=0xb98270)
    at src/main/process.c:4266
        rcode = 7
        pre_proxy_type = 0
        vp = 0x0
        coa = 0x7fff500039d0
        ipaddr = {af = 2, ipaddr = {ip4addr = {s_addr = 1275110922},
            ip6addr = {__in6_u = {
                __u6_addr8 = "\n\246\000L", '\000' <repeats 11 times>,
                __u6_addr16 = {42506, 19456, 0, 0, 0, 0, 0, 0},
                __u6_addr32 = {1275110922, 0, 0, 0}}}},
          prefix = 32 ' ', scope = 0}

buffer =
"\000\207\275\333\377\177\000\000\000\234\240F\367&\270\037\240}\275\333\377\177\000\000\355l\226\367\377\177\000\000\260}\275\333\377\177\000\000pE\000P\377\177\000\000\340}\275\333\243\000\000\000\320\t\230\367\377\177\000\000\031\000\000\000S\000\000\000\270\321\203\000\000\000\000\000\220\236\271\000\000\000\000\000
z\202\000\000\000\000\000\004\000\000\000\000\000\000\000pE\000P\377\177\000\000\000\000\000\000\000\000\000\000\020~\275\333\377\177\000\000\340}\275\333\377\177\000\000.4\225\367\377\177\000\000\340}\275\333\377\177\000\000\315\061\225\367\200\177\000\000\000\000\000\000\070\004\000\000\020~\275\333\377\177\000\000\b~\275\333\377\177\000\000\000\000\000\000\000\000\000\000@
~\275\333\377\177\000\000"...

#2  0x0000000000440ad3 in request_finish (request=0xb98270, action=1)
    at src/main/process.c:1376
        vp = 0x0
#3  0x000000000044128f in request_running (request=0xb98270, action=1)
    at src/main/process.c:1604
        __FUNCTION__ = "request_running"
#4  0x000000000043ca0b in request_handler_thread (arg=0xb66d60)
    at src/main/threads.c:826
        self = 0xb66d60
#5  0x00007ffff6c6c6ba in start_thread (arg=0x7fffdbbd8700)
    at pthread_create.c:333
        __res = <optimized out>
        pd = 0x7fffdbbd8700
        now = <optimized out>
        unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140736880019200,
              -8854724666266409425, 0, 140737488346975, 140736880019904, 0,
              8854646064075843119, 8854744653752641071},
            mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0},
            data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
        not_first_call = <optimized out>
        pagesize_m1 = <optimized out>
        sp = <optimized out>
        freesize = <optimized out>
        __PRETTY_FUNCTION__ = "start_thread"
#6  0x00007ffff676a41d in clone ()
    at ../sysdeps/unix/sysv/linux/x86_64/clone.S:109
No locals.



Regards,
8zero2 operations
Mail: ***@gmail.com
Facebook: www.facebook.com/8zero2
Twitter: @8zero2_in
Blog: blog.8zero2.in
-
List info/subscribe/unsubscribe? See http://www.freeradius.org/list/users.html
Alan DeKok
2018-11-06 12:47:15 UTC
Post by 8zero2 operations
Here is a trace of crash
Which version of FreeRADIUS is this? That might help...

Alan DeKok.

8zero2 operations
2018-11-07 19:16:09 UTC
v3.0.x

Post by Alan DeKok
Which version of FreeRADIUS is this? That might help...
8zero2 operations
2018-11-08 04:36:27 UTC
In the logs, the following is shown with the crash:

ASSERT FAILED src/main/util.c[568]: !request->in_proxy_hash

Post by 8zero2 operations
v3.0.x
Alan DeKok
2018-11-11 15:18:48 UTC
Post by 8zero2 operations
In logs the following is shown with crash
ASSERT FAILED src/main/util.c[568]: !request->in_proxy_hash
Hmm... some additional information may help. Such as the debug log from when it crashes. Or, a description of your configuration.

People are running v3.0.x with lots of proxying, so we know that most configurations work. The question here is what's different about your configuration?

Alan DeKok.


8zero2 operations
2018-11-19 10:33:27 UTC
Hi,

Apologies for the late response. The problem only appears in threaded mode;
there are no problems in debug mode.

I am not even using proxying in the configuration. I have a lot of unlang in
accounting and post-auth, and some PHP-based exec modules.

Post by Alan DeKok
Hmm... some additional information may help. Such as the debug log from
when it crashes. Or, a description of your configuration.
Adam Bishop
2018-11-19 10:40:14 UTC
Post by 8zero2 operations
Apologies for late response. Problem only comes in threaded mode. No
problems in debug mode.
You can attach to the running process with radmin and turn on debugging from there if it isn't crashing under 'radiusd -X'.

Try starting radmin, then run 'debug level 4'.

https://freeradius.org/radiusd/man/radmin.html
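Adam's suggestion as a command sequence (an operations sketch, not from the thread: it assumes the control-socket listener is enabled in radiusd's configuration, and the log file path is a placeholder):

```shell
# Interactive session:
#   $ radmin
#   radmin> debug level 4

# Non-interactive, using radmin's -e flag to run a single command:
radmin -e "debug level 4"

# Optionally redirect the debug output to a file (path is hypothetical):
radmin -e "debug file /var/log/freeradius/debug.log"
```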

Adam Bishop

gpg: E75B 1F92 6407 DFDF 9F1C BF10 C993 2504 6609 D460

jisc.ac.uk



